Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

Working with Big Query...pulling hair out

warrencowan
9 - Comet

Wanted to get started with BigQuery today... 7 hours later and wow, am I still struggling. Nothing works or behaves like it's documented.

 

Input
Got the Simba drivers (2.4.6.1015), but the Simba DSN couldn't be validated by user sign-in, as Google seems to have abandoned this method.

Finally got it working by downloading the service account key file and using the service-auth method instead.
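For anyone following along, the working service-auth DSN ended up with settings roughly like these. The key names are from the Simba BigQuery ODBC driver's install guide as best I recall, and the project, email, and path are placeholders - check the guide for your driver version:

```ini
[BigQuery-ServiceAuth]
Driver=Simba ODBC Driver for Google BigQuery
; 0 = service account authentication (user sign-in would be 1)
OAuthMechanism=0
Email=my-service-account@my-project.iam.gserviceaccount.com
KeyFilePath=C:\keys\my-project-service-key.json
; Catalog is the GCP project the driver queries against
Catalog=my-project
```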

The DCM let me create a BigQuery connection using Simba, which actually let me see my table in the Input tool,

but it wouldn't work because it was missing 'a refresh token'??? Couldn't find any support on that.

So then I tried creating a normal BigQuery connection in the DCM using the regular ODBC option, which finally looks like it works... but takes ages.

Is this OK, or am I doing this sub-standard? I always worry when the documented steps don't work and something hackish does.

 

Output
Then I went to create a bulk export connection to BigQuery, and none of the options work.

Got my project and bucket details right, but after that...
...tried using the sign-in methods, but they don't work.

So then I tried the key-file approach with the file location of the service account key I registered in the DSN, and that didn't work either.

It just keeps coming up with 'missing credential' when I try to link.
I've tried using the same ODBC connection in the export, but it takes 5 minutes to write 100 rows, so I really want to make the bulk process work.

 

Nothing in the documentation is straightforward or works with this. Tools are deprecated, the documentation is out of date, and Google keeps moving the goalposts on what will and won't work.

 

What gives? Has anyone had more success with this?

 

5 REPLIES
apathetichell
19 - Altair

Google says to use a version after 2.52 - https://cloud.google.com/bigquery/docs/reference/odbc-jdbc-drivers

 

TBH - I've had luck with this but haven't looked at it in 18 months or so. 'Refresh token' sounds like it's expecting a redirect URL for an OAuth flow and not finding one.

warrencowan
9 - Comet

Thanks @apathetichell - the new drivers seemed to make authentication a lot simpler.

I'm still having problems using the bulk output though.

 

I go through all the motions only to get to this. Any idea what I'm seeing?

 

[Attached screenshot: alteryx BQ error.png]

 

apathetichell
19 - Altair

You see that when you hit create and link - right? Can you click on the 'allow connection for SDK' button and retry? I'd strongly recommend writing to a GCS bucket if possible and then having BQ transform/build tables off of GCS.

warrencowan
9 - Comet

Yes, that's correct. I've tried it with the SDK checkbox ticked and unticked, but the result is the same.

 

I have already set up a storage bucket and believe I have the correct details configured.

 

Just can't catch a break on this one.

apathetichell
19 - Altair

So, some background -

 

1) I would posit that GCP is less Python-friendly than AWS and more Rust/Go/Node.js-friendly. Alteryx is Python-only in terms of these connections.

2) GCP changed some of their credential requirements fairly recently (3-4 months ago), which could (could) have impacted this. I'm not fully sure.

 

3) Can you turn on logs for your ODBC driver and see what happens on write? If this is trying to write to a GCS bucket, can you confirm that the GCP user has both list and write privileges for the bucket? I've seen many situations where a user/app is granted write/read but not list access, and without list, setup of the integration fails.
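If it helps, a quick way to probe both permissions from the gcloud CLI while authenticated as the same service account (the bucket name here is a placeholder):

```shell
# Probe list access - this is the permission that's most often missing
gcloud storage ls gs://my-alteryx-staging-bucket/

# Probe write access with a throwaway object
echo probe > probe.txt
gcloud storage cp probe.txt gs://my-alteryx-staging-bucket/probe.txt

# Inspect the bucket's IAM policy to confirm the account's roles
gcloud storage buckets get-iam-policy gs://my-alteryx-staging-bucket
```

If the first command fails but the second succeeds, that's exactly the write-without-list situation described above.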

 

4) The "what I would do" option - I would script this entirely and run it via command. That would let me control what I'm sending, and it moves the authentication from Alteryx/Alteryx tools to the end user, which my company presumably controls/knows about. So, something like -

gcloud storage cp OBJECT_LOCATION gs://DESTINATION_BUCKET_NAME
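...and then, to finish the GCS-to-BigQuery leg, load the staged object into a table with the bq CLI (the dataset/table names are placeholders, and I'm assuming a CSV extract with a header row):

```shell
# Load the staged file from GCS into a BigQuery table
bq load --source_format=CSV --skip_leading_rows=1 \
  my_dataset.my_table \
  gs://DESTINATION_BUCKET_NAME/OBJECT_LOCATION
```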