Alteryx Designer Knowledge Base

Definitive answers from Designer experts.

Google BigQuery Tools





You can download the BigQuery Connector Tools here.


There are two ways to log into the Google BigQuery Connector.

  1. Service to Service:
    1. Service-to-service authentication requires a Google Service Account File, also known as a service account JSON key file. You can obtain a service account JSON key file from the Google Cloud Console, or you can create a new key for an existing service account. More information about Google BigQuery can be found on the Google BigQuery documentation site under Creating and Managing Service Account Keys.
  2. End User:
    1. In Authentication mode, select End user.
    2. Enter your Client ID and Client Secret.
    3. Click Log in.
    4. In Sign in with Google, enter your Google email and password then click Allow to approve the connector's permission to access Google BigQuery on your behalf.
    5. Click Next.
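As a quick sanity check before pointing the connector at a key file, you can inspect the file yourself: every service account JSON key downloaded from the Cloud Console has a `type` of `service_account` plus project and account identifiers. A minimal sketch (hypothetical helper, not part of the connector):

```python
import json

def check_service_account_key(path):
    """Light sanity check on a Google service account JSON key file.

    Illustrative only -- the connector itself validates the file when
    you log in with Service-to-Service authentication.
    """
    with open(path) as f:
        key = json.load(f)
    if key.get("type") != "service_account":
        raise ValueError("not a service account key file")
    # These fields are present in every key downloaded from the Cloud Console.
    return {k: key[k] for k in ("project_id", "client_email")}
```

If this raises, you likely exported an OAuth client (End User) credential rather than a service account key.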

After connecting, select the table you want to read from. You can also refresh the metadata by clicking on the refresh icon.


The next thing to look at is the five configuration options:

  1. Limit the number of results returned
    • This is similar to the TOP or LIMIT clause used to limit how many records are returned
  2. Use a custom query
    • Offers more flexibility in selecting input data from BigQuery; however, using a custom query disables the "infer max repeated record limit" and "limit the number of results returned" options
    • You can use either the Standard or Legacy SQL dialect
    • You must also select a project to query
    • Only the Repeated record limit and Max field length options remain available
  3. Infer max repeated record limit
    • This will bring in the nested and repeated fields
      [Image: Inside BigQuery table]

      [Image: Returned nested rows inside Alteryx]

  4. Repeated record limit
    • Use this if you do not want the inferred max repeated record limit
  5. Max field length
    • Use this to select the size of the data coming in
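For reference, the "limit the number of results returned" option behaves like a LIMIT clause; with a custom query you write it yourself in the dialect you selected. A sketch of what such a Standard SQL query looks like (the table name and helper are placeholders, not part of the tool):

```python
def limited_query(table, n):
    """Build a Standard SQL query that caps the rows returned.

    `table` is a fully-qualified `project.dataset.table` name.
    Hypothetical helper for illustration -- in the tool you would paste
    the resulting SQL into the custom query box.
    """
    return f"SELECT * FROM `{table}` LIMIT {int(n)}"
```

For example, `limited_query("my-project.my_dataset.sales", 100)` yields a query returning at most 100 rows.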

Google Big Query Output Tool

  1. Similar to the Google BigQuery Input tool, there are two ways to log in – Service to Service and End User. Please look above to see which authentication mode is right for you.
  2. Select the table you want to write to. You need an existing table to write to.
  3. Insert Batch Size
    • This determines the number of rows inserted into BigQuery per API request.
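What the batch size controls can be sketched in plain Python: the incoming rows are simply sliced into one chunk per API request. This is an illustration of the concept, not the tool's actual code:

```python
def batches(rows, batch_size):
    """Yield successive slices of `rows`, one slice per API request.

    Illustrative sketch of what the "Insert Batch Size" setting governs;
    a larger batch size means fewer, bigger requests to BigQuery.
    """
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]
```

With 10 rows and a batch size of 4, this produces three requests of 4, 4, and 2 rows.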

Basic Troubleshooting Steps

Permissions Error Messages

  • 403 Invalid Permissions
    • Users will see 403 Invalid Permissions errors when attempting to write to a table for which they do not have update permissions
  • No Projects Available. Check your credentials and permissions.
    • This error is displayed when a user authenticates to Google but has no permissions for any BigQuery project


Known Limitations for the Google BigQuery Tools can be found here.


Outputting data from your Designer workflow to Google BigQuery streams new rows to the table in BigQuery. Data will be held in a temporary streaming buffer for up to 90 minutes while processes in Google BigQuery convert the row-based data to columnar-based storage. In most cases, the data in the streaming buffer is still available for querying within seconds. However, you will not be able to see the inserted data in the Preview tab of the Google BigQuery console for the table until the extraction processes are complete. You can learn more about the streaming buffer in this Google Blog Post. More information about the availability of streamed data can be found on the Google BigQuery Documentation site.
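If you need to know when the buffer has drained, the BigQuery API exposes streaming-buffer statistics on the table resource; in the official Python client, `client.get_table(table_id).streaming_buffer` becomes `None` once extraction is complete. A sketch of a polling helper, written against an injected callable so the idea stands on its own without credentials (a hypothetical helper, not part of the connector):

```python
import time

def wait_for_buffer_flush(get_streaming_buffer, poll_seconds=60, max_polls=90):
    """Poll until the table's streaming buffer drains (up to ~90 minutes).

    `get_streaming_buffer` is any callable returning the buffer stats, or
    None once extraction is complete -- e.g. a wrapper around
    `client.get_table(table_id).streaming_buffer` from google-cloud-bigquery.
    Returns True if the buffer drained, False if we gave up waiting.
    """
    for _ in range(max_polls):
        if get_streaming_buffer() is None:
            return True
        time.sleep(poll_seconds)
    return False
```

The defaults mirror the documented worst case: 90 one-minute polls covers the up-to-90-minute buffer window.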

5 - Atom

Hi DiganP,

I can't seem to download the bigquery tool from the link:!app/Google-BigQuery-Tools/5b1f092a8a93372a64c28c09


I get an error message "Problem Loading App". Is there something I am missing?





@marniedc Can you try deleting your cache from your browser? Did you also try multiple browsers (IE, Chrome)? It seems to work fine for me. 

7 - Meteor



@DiganP I just updated to the new tool version (1.0.3) and I get the following error:


Error: Google BigQuery Input (1): Traceback (most recent call last):
File "C:\Users\...\AppData\Roaming\Alteryx\Engine\../Tools\big_query_input_v1.0.3\big_query_input\", line 177, in pi_init
self.alteryx_engine.create_connect_metadata(self.n_tool_id, metadata_dict)
AttributeError: 'AlteryxEngine' object has no attribute 'create_connect_metadata'


I didn't have any problems connecting with the old version of the tool, but now I have no clue what's happening... 😞

7 - Meteor

I'm now also having problems with tables that have 'numeric' fields... (using version 1.0.0, since I am still not able to work with the newest tool version) I have to use custom SQL and cast the numeric fields as strings. Any clue if there is going to be an update to include this?


@FabianS When you install the tool, please run Designer as admin (right click > run as admin) and then install the tool. Try this in the newer version and let me know of the results with 'numeric' fields. 

7 - Meteor

@DiganP I upgraded to Alteryx Designer 2018.3 and everything is working fine with the 1.0.3 version of the tool (so to me it seems that it is incompatible with 2018.2).

Now the new tool is working fine, but when I try to select a numeric field it gives me "Google BigQuery Input (1) There was an unexpected error." If I do cast(_numeric field_ as string) then everything works perfectly, except that only custom queries work...

Is this something that is not yet supported? I understood that the numeric field is quite a new feature in BigQuery.

Alteryx Partner

I'm facing errors using the GBQ Output tool. Is there a direct contact I can use for help?


Basically I don't know what's wrong (first time using this tool), so I'm not sure if the problem is the field order, incompatible data types, etc.

Alteryx Partner

Maybe a simple problem, but when Alteryx tries to write an Int16 field to a GBQ integer field it fails. If you change the Alteryx field to Int64 it works perfectly.

7 - Meteor

@DiganP I'm having a similar error to @FabianS. Were you able to identify the issue, I'm not sure what the error means or how to fix it:


Error: Google BigQuery Output (2): Traceback (most recent call last):
File "C:\Users\...\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\big_query_output\", line 85, in ii_push_record
File "C:\Users\...\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\alteryx\", line 150, in display_error_msg
self.alteryx_engine.output_message(self.n_tool_id, Sdk.EngineMessageType.error, msg_string)
Boost.Python.ArgumentError: Python argument types in
AlteryxEngine.output_message(AlteryxEngine, int, EngineMessageType, list)
did not match C++ signature:
output_message(class SRC::Python::AlteryxEngine {lvalue}, int tool_id, int status, char const * __ptr64 message)





Alteryx Partner

Did you check that the datatypes in Alteryx match the datatypes in the GBQ table?


What datatypes are you using in GBQ? (some can be tricky to align in Alteryx)

6 - Meteoroid


Running Alteryx as ADMIN fixed my issues.




I am running into an error when I try to install the connector. Any help you could provide would be greatly appreciated. Thanks!




Alteryx Alumni (Retired)



That looks like it worked.  Thanks for the help!

5 - Atom



Really hoping someone can help with the below error (I'm fairly new to Alteryx).


The below error is generated when I try to go from Alteryx to Google BigQuery Output (using this connector). I've set up the new table schema in BigQuery the same as in Alteryx, and tried to match up the data types as closely as possible, but the error below doesn't give me much to work on.


Error: Google BigQuery Output (2): Traceback (most recent call last):
File "\\mainline2016dc\Redirect$\\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\big_query_output\", line 74, in ii_push_record
row = self.schema_writer.build_row_from_record(in_record, self.parent.alteryx_engine)
File "\\mainline2016dc\Redirect$\\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\bigquery\", line 151, in build_row_from_record
field_val = alteryx_helpers.get_value_for_field(field, record_ref)
File "\\mainline2016dc\Redirect$\\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\alteryx\", line 39, in get_value_for_field
return accessor(record_ref)
File "\\mainline2016dc\Redirect$\\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\alteryx\", line 32, in <lambda> lambda ref: datetime.datetime.strptime(field.get_as_string(ref), DATE_FORMAT),
TypeError: strptime() argument 1 must be str, not None


Hoping this is a simple one and just me being new!


Thanks for any help

6 - Meteoroid

Hi Jason , 


I had the same issue at the start. I will try and narrow down how I got it working by asking the below:


Which version of Alteryx are you running?


Are all your Python libraries up to date?

 - Inside the BigQuery connector zip file, there is a requirements text file. Do you have all of those installed?


Have you added Python to your PATH environment variable?


Which version of Python are you using?









5 - Atom



Apologies I'm definitely a noob in this regard.


Alteryx Version - 2019.2.5.62427


In terms of installation, it was a case of just running the base file downloaded from the Alteryx Gallery, with nothing specific selected when installing, so it should have installed everything included (took around 20 minutes).


How would I go about finding the Python-related information? I haven't written any Python myself when setting this up.


Is my issue potentially how I am trying to link the two? At present, it goes from a Calgary Input (loading my dataset into a Calgary Loader to boost the speed) to the Google BigQuery Output - do I potentially need something in between the two?



5 - Atom

Hi all,


Good news - I've managed to get the upload link working...


Bad news - the data duplicates each time I re-run the workflow (e.g. my end workflow of data is just over 400k records; I've run the workflow twice today - around an hour each time - and now BigQuery is saying over 800k records).


How is this fixed? Re-uploading 400k records each time is not efficient, nor is appending them all to the existing table rather than just adding the new rows.


Help - there seems to be no configuration around this.
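For context: the connector streams inserts, so each run appends and there is no replace/append toggle in this version. One blunt workaround is to dedupe the table afterwards with a query run in the BigQuery console. A sketch that builds such a statement, assuming full-row duplicates and scalar columns only (SELECT DISTINCT * fails on repeated or struct fields):

```python
def dedupe_statement(table):
    """Build a Standard SQL DDL statement that rewrites `table` with
    duplicates removed.

    Hypothetical helper for illustration; `table` is a fully-qualified
    `project.dataset.table` name. Assumes every column is scalar and a
    duplicate means the entire row repeats.
    """
    return (
        f"CREATE OR REPLACE TABLE `{table}` AS "
        f"SELECT DISTINCT * FROM `{table}`"
    )
```

Alternatively, truncating the table before each workflow run avoids the duplicates in the first place.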



5 - Atom

Thanks for developing the connector. Is there a way to specify whether I can 'append' instead of 'replace' the data exported to bigquery?


Alteryx Partner


I am trying to do a query via BigQuery connector and this is the error message I get....

Error: Google BigQuery Input (1): Traceback (most recent call last):
File "", line 4, in <module>
File "C:\Users\MainUser\AppData\Roaming\Alteryx\Engine\../Tools\big_query_input_v1.0.3\big_query_input\", line 1, in
from .engine import BigQueryInputEngine
File "C:\Users\MainUser\AppData\Roaming\Alteryx\Engine\../Tools\big_query_input_v1.0.3\big_query_input\", line 5, in
from connectors_core.bigquery import BigQuerySchemaReader
File "C:\Users\MainUser\AppData\Roaming\Alteryx\Engine\../Tools\big_query_input_v1.0.3\connectors_core\bigquery\", line 1, in
from .schemas import BigQuerySchemaReader
File "C:\Users\MainUser\AppData\Roaming\Alteryx\Engine\../Tools\big_query_input_v1.0.3\connectors_core\bigquery\", line 4, in
ModuleNotFoundError: No module named 'google'


Any idea what is wrong?


5 - Atom

This is the first time I am connecting to Google BigQuery Input through Alteryx. I used service-to-service authentication mode and was able to establish the connection and see the list of available tables. When I select any table or use a custom query, I keep getting "There was an unexpected error". I do not see any additional details around that error. I am using version 2019.4.


Am I missing something? Why do I keep getting this error? Is there a different approach to using BigQuery tables in Alteryx?



11 - Bolide



Also hitting the same error:



Error: Google BigQuery Output (252): Traceback (most recent call last):
  File "C:\Users\Gavin\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\big_query_output\", line 74, in ii_push_record
    row = self.schema_writer.build_row_from_record(in_record, self.parent.alteryx_engine)
  File "C:\Users\Gavin\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\bigquery\", line 151, in build_row_from_record
    field_val = alteryx_helpers.get_value_for_field(field, record_ref)
  File "C:\Users\Gavin\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\alteryx\", line 39, in get_value_for_field
    return accessor(record_ref)
  File "C:\Users\Gavin\AppData\Roaming\Alteryx\Engine\../Tools\big_query_output_v1.0.3\connectors_core\alteryx\", line 32, in <lambda> lambda ref: datetime.datetime.strptime(field.get_as_string(ref), DATE_FORMAT),
TypeError: strptime() argument 1 must be str, not None




I diagnosed this as the BigQuery Output tool not liking Date fields. Changing these to string solves the issue.
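The traceback itself points at a null value: `strptime(None, ...)` raises exactly that TypeError, so a Date column containing nulls will trip the writer. An illustrative null guard (the format string here is an assumption, not necessarily the tool's actual `DATE_FORMAT`):

```python
import datetime

DATE_FORMAT = "%Y-%m-%d"  # assumed format; the tool's actual constant may differ

def parse_date(value, fmt=DATE_FORMAT):
    """Parse a date string while tolerating nulls.

    strptime(None, ...) raises "TypeError: strptime() argument 1 must be
    str, not None" -- the exact error in the traceback above -- so null
    values must be handled before parsing.
    """
    if value is None:
        return None
    return datetime.datetime.strptime(value, fmt)
```

In a workflow, the equivalent fix is filtering or filling null dates (or converting the field to string) before the Output tool.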


Also, in case you were not aware, the table has to be pre-defined, including the schema, which you can do via the console.




Ok, so after getting over my initial moan, and a bit of reading later, I have some further insight.


Part of the limitation is that the tool uses the streaming-insert option (the insertAll call), which is suitable if you were streaming event data, but I'm not quite sure this is the best route for a connector into BigQuery.


Interestingly (and I need to delve into this a bit more...) it appears that the only other option is to drop JSON (or other) files into some location and then call a load job to load them in.
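For anyone exploring that route: load jobs consume newline-delimited JSON, which the official Python client can submit via `client.load_table_from_file` with `source_format` set to newline-delimited JSON. A sketch of just the serialization step, stdlib only (the load-job call itself would need credentials):

```python
import io
import json

def to_ndjson(rows):
    """Serialize records as newline-delimited JSON in an in-memory buffer.

    This is the file format BigQuery load jobs accept; illustrative only.
    You would then hand the buffer to e.g. client.load_table_from_file(...)
    from google-cloud-bigquery.
    """
    buf = io.BytesIO()
    for row in rows:
        buf.write(json.dumps(row).encode("utf-8") + b"\n")
    buf.seek(0)
    return buf
```

Unlike streaming inserts, a load job has no streaming buffer delay and supports write dispositions such as append or truncate.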


More reading required, but I'll drop a post on my findings and keep it updated as I go along.


humpf... shame, but such is life...  







Hi guys,


I believe the BigQuery connector is going through an update/fix and is currently in beta. You can sign up for the beta and try the connector here!


@Ashwini_Bende The situation and error you are specifically describing sometimes occurs in proxy server environments (PAC or manual). We do have a fix for this, please do not hesitate to reach out to Alteryx support with questions. You may even mention this post for assistance, we are always happy to help. And @DiganP is correct! We will be releasing a new version of the BigQuery tool later this month that includes several important fixes and enhancements.


Thanks and have a great week,


Sr Customer Support Engineer - Connectors SME

11 - Bolide

@SabrinaCB @DiganP  - Awesome Thanks for the update. I'll sign up for the beta. 



5 - Atom

I'm hoping someone can help with an issue we are having using the BigQuery connector. We are able to access BigQuery with the tool, but get the following error:

"Error: Google BigQuery Input (1): Schema exception: Field customDimensions has repeated record data type, which is not supported in Alteryx"


I know there have been some threads on flattening the records with a custom query, but I thought I also read in a post on this site that version 2.0 of the tool solves that? Any help you can render is appreciated, as this has us stalled.
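Until a tool version handles repeated records, the usual workaround is a custom query (Standard SQL) that flattens them with UNNEST. A sketch against a hypothetical GA-export-style schema, where `customDimensions` is a repeated record with `index` and `value` fields; the table and field names are assumptions, so substitute your own:

```python
# Flattens each element of the repeated record into its own output row.
# LEFT JOIN keeps rows whose customDimensions array is empty.
FLATTEN_QUERY = """
SELECT
  t.* EXCEPT (customDimensions),
  cd.index AS cd_index,
  cd.value AS cd_value
FROM `my-project.my_dataset.hits` AS t
LEFT JOIN UNNEST(t.customDimensions) AS cd
"""
```

Pasting a query like this into the tool's custom query box (Standard SQL dialect) returns scalar columns that Alteryx can read.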


Thank you,

5 - Atom

Here is a link to the other thread regarding a solution:


Thank you,

5 - Atom



I followed the steps given in the following link: !app/Google-BigQuery-Tools/5b1f092a8a93372a64c28c09


Downloaded Google BigQuery Tools v2.0.1 (June 9, 2020) to my machine.

During installation I get the following message for big_query_input and big_query_output:


"No matching distribution found for azure-common==1.1.14 (from -r requirements.txt (line 1))"


Alteryx Designer version: 2019.4.8.22007


Any help will be much appreciated.

6 - Meteoroid




Do you have it installed?


In Alteryx use the Python connector and type


!pip list


Then hit run


That will show you a list of installed Python libraries.






If it's not there, type


!pip install azure-common


Then Run



Good Luck 😀