Folks:
I am working in Alteryx Cloud Designer. I have a relatively simple workflow that reads a CSV file, selects a few fields, sorts them, removes duplicates, adds a surrogate key (row number), and then attempts to load the result into a table in Google BigQuery.
When I run the workflow, everything seems to go well until the last step, when I receive the following error:
BigQueryUnexpectedException: {"cause":{"cause":null,"message":"URI: tfs://my-workspace-name/113997/.trifacta/c730a878-bad6-4d22-a931-5519769e3561/2410144/.publish/table_c730a878-bad6-4d22-a931-5519769e3561.csv/data.csv is not a GCS URI"},"message":"Error while publishing to Bigquery: URI: tfs://my-workspace-name/113997/.trifacta/c730a878-bad6-4d22-a931-5519769e3561/2410144/.publish/table_c730a878-bad6-4d22-a931-5519769e3561.csv/data.csv is not a GCS URI"}
(I replaced my actual workspace name with 'my-workspace-name' above)
It is my understanding that the Google BigQuery driver first writes the data to a staging CSV file in Google Cloud Storage and then runs a load job that reads that CSV into BigQuery.
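If that understanding is right, the error may simply be that the staging path handed to BigQuery uses the internal tfs:// (Trifacta file system) scheme, while BigQuery load jobs only accept Cloud Storage URIs of the form gs://bucket/object. A minimal sketch of that check (the is_gcs_uri helper and the example paths are my own illustration, not Alteryx code):

```python
from urllib.parse import urlparse

def is_gcs_uri(uri: str) -> bool:
    """True only for Cloud Storage URIs of the form gs://bucket/object,
    which is what BigQuery load jobs require as source URIs."""
    parsed = urlparse(uri)
    return parsed.scheme == "gs" and bool(parsed.netloc)

# The staging path in my error message uses the tfs:// scheme,
# which fails this check -- hence "is not a GCS URI":
print(is_gcs_uri("tfs://my-workspace-name/113997/.trifacta/data.csv"))  # False
print(is_gcs_uri("gs://my-bucket/staging/data.csv"))                    # True
```

So it looks like the workflow is staging the file on Trifacta's own storage rather than on a GCS bucket, which is why I was looking for a temporary-bucket setting.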
I have gone through various pieces of Alteryx documentation, and none of what I find seems to apply to Alteryx Cloud Designer. For example, I see references to setting a temporary storage bucket name, but that setting appears to exist only in the desktop version of Designer.
Please let me know if you have any suggestions.
Thanks