on 07-13-2018 01:16 PM - edited on 03-11-2019 09:04 AM by SydneyF
Google BigQuery Tools
You can download the Google BigQuery Tools here.
There are two ways to authenticate to the Google BigQuery connector: Service-to-Service and End User.
Service-to-Service:
Service-to-service authentication requires a Google service account file, also known as a service account JSON key file. You can obtain a service account JSON key file from the Google Cloud Console, or you can create a new key for an existing service account. More information can be found on the Google BigQuery documentation site under Creating and Managing Service Account Keys.
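A service account JSON key file always carries a fixed set of fields. As a minimal sketch, a pre-flight check like the following (the `validate_key_file` helper name is hypothetical, not part of the connector) can confirm a key file contains what authentication needs before you attempt to log in:

```python
import json

# Fields present in every Google service account JSON key file.
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "token_uri",
}

def validate_key_file(path):
    """Parse a service account key file, raising if required fields are missing."""
    with open(path) as f:
        key = json.load(f)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"Key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError("Not a service account key file")
    return key
```

A key file that fails this check (for example, an OAuth client secrets file downloaded by mistake) would also fail in the connector's Service-to-Service login.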
End User:
In Authentication mode, select End user.
Enter your Client ID and Client Secret.
Click Log in.
In Sign in with Google, enter your Google email and password, then click Allow to grant the connector permission to access Google BigQuery on your behalf.
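Behind the steps above is Google's standard OAuth 2.0 authorization-code flow: the connector sends you to Google's consent screen with its Client ID and a scope covering BigQuery. A rough sketch of how such a consent URL is assembled (the `build_auth_url` helper and the redirect URI are illustrative assumptions, not the connector's actual internals):

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 authorization endpoint and the BigQuery scope.
AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
BIGQUERY_SCOPE = "https://www.googleapis.com/auth/bigquery"

def build_auth_url(client_id, redirect_uri):
    """Assemble the consent-screen URL the user is sent to."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,  # illustrative; the real client sets its own
        "response_type": "code",       # authorization-code flow
        "scope": BIGQUERY_SCOPE,
        "access_type": "offline",      # request a refresh token
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"
```

Clicking Allow on that screen is what returns the authorization code the connector then exchanges for tokens.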
After connecting, select the table you want to read from. You can refresh the table metadata by clicking the refresh icon.
Next, look at the five configuration options:
Limit the number of results returned
This works like a SQL TOP or LIMIT clause, restricting how many records are returned.
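For comparison, the equivalent effect in BigQuery SQL is appending a LIMIT clause; a hypothetical helper showing the transformation this option mirrors:

```python
def cap_results(query, max_records):
    """Append a LIMIT clause, mirroring the tool's record-limit option."""
    return f"{query.rstrip().rstrip(';')} LIMIT {int(max_records)}"
```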
Use a custom query
Offers more flexibility in selecting input data from BigQuery. However, using a custom query disables the Infer max repeated record limit and Limit the number of results returned options.
You can use either the Standard or Legacy SQL dialect.
You must also select a project to query.
Only the Repeated record limit and Max field length options remain available.
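The dialect choice maps onto BigQuery's query API, whose request body takes a useLegacySql flag; a minimal sketch of building such a body (the helper name is hypothetical, and a real request would also need authentication and a project ID):

```python
def build_query_request(sql, dialect="standard"):
    """Build the body of a BigQuery jobs.query REST request.

    The REST API defaults useLegacySql to true, so Standard SQL
    must set the flag to false explicitly.
    """
    if dialect not in ("standard", "legacy"):
        raise ValueError("dialect must be 'standard' or 'legacy'")
    return {
        "query": sql,
        "useLegacySql": dialect == "legacy",
    }
```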
Infer max repeated record limit
This brings in the nested and repeated fields from inside the BigQuery table and returns the nested rows inside Alteryx.
Repeated record limit
Use this to set the repeated record limit yourself when you do not want the inferred maximum.
Max field length
Use this to set the maximum length of each incoming field.
Google BigQuery Output Tool
As with the Google BigQuery Input tool, there are two ways to log in: Service-to-Service and End User. See the authentication section above to determine which mode is right for you.
Select the table you want to write to. The table must already exist in BigQuery.
Insert Batch Size
This determines the number of rows inserted into BigQuery per API request.
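Conceptually, the output tool splits the incoming records into chunks of this size and streams each chunk in its own insert request; a simplified sketch of that batching (the `batch_rows` helper name is hypothetical):

```python
def batch_rows(rows, batch_size):
    """Yield successive chunks of rows, one per streaming-insert request."""
    if batch_size < 1:
        raise ValueError("batch size must be at least 1")
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]
```

For example, with 250 incoming rows and an Insert Batch Size of 100, this produces three requests of 100, 100, and 50 rows.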
Basic Troubleshooting Steps
Permissions Error Messages
403 Invalid Permissions
Users see 403 Invalid Permissions errors when attempting to write to a table for which they do not have update permissions.
No Projects Available. Check your credentials and permissions.
This error is displayed when a user authenticates to Google successfully but has no permissions for any BigQuery project.
Known Limitations for the Google BigQuery Tools can be found here.
Outputting data from your Designer workflow to Google BigQuery streams the new rows to the table in BigQuery. The data is held in a temporary streaming buffer for up to 90 minutes while processes in Google BigQuery convert the row-based data to columnar storage. In most cases, data in the streaming buffer is still available for querying within seconds; however, you will not be able to see the inserted data in the Preview tab of the Google BigQuery console for the table until the extraction processes are complete. You can learn more about the streaming buffer in this Google blog post, and more information about the availability of streamed data can be found on the Google BigQuery documentation site.