Folks:
I am building what I think is a fairly basic workflow in Alteryx Designer Cloud. I read data from a CSV file, select a few fields, sort, and remove duplicates; the goal is then to insert the result into a Google BigQuery table.

I have configured a BigQuery Connection and the destination table exists in a BigQuery dataset.
When running the workflow, everything seems to go well until the last step. I receive the following error:
BigQueryUnexpectedException: {"cause":{"cause":null,"message":"URI: tfs://my-workspace-name-2025/113997/.trifacta/77cdec74-9da7-4573-ac03-729464baae2d/2394669/.publish/table_77cdec74-9da7-4573-ac03-729464baae2d.csv/data.csv is not a GCS URI"},
"message":"Error while publishing to Bigquery: URI: tfs://my-workspace-name-2025/113997/.trifacta/77cdec74-9da7-4573-ac03-729464baae2d/2394669/.publish/table_77cdec74-9da7-4573-ac03-729464baae2d.csv/data.csv is not a GCS URI"}
It is my understanding that when sending output to BigQuery, Alteryx first writes the data to a temporary file in storage, then reads that file back and loads it into BigQuery.
From the error message, it appears that the temp file is written to Trifacta file storage (tfs://).
From bits of documentation, I believe Google BigQuery expects the temporary data file to reside in Google Cloud Storage (gs://).
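For context, this matches how BigQuery load jobs behave outside Alteryx: a file-based load only accepts gs:// source URIs, so a tfs:// path is rejected before any data is read. Here is a minimal sketch of that scheme check (the helper name and the bucket path are mine for illustration, not an Alteryx or BigQuery API):

```python
from urllib.parse import urlparse

def is_gcs_uri(uri: str) -> bool:
    """BigQuery load jobs only accept Google Cloud Storage URIs (gs://...)."""
    return urlparse(uri).scheme == "gs"

# A tfs:// URI like the one in the error message fails the check,
# which is consistent with the "is not a GCS URI" rejection:
print(is_gcs_uri("tfs://my-workspace-name-2025/data.csv"))   # False
print(is_gcs_uri("gs://my-bucket/staging/data.csv"))         # True

# With the google-cloud-bigquery client, the equivalent load would look like:
#   client.load_table_from_uri("gs://my-bucket/staging/data.csv",
#                              "my_dataset.my_table")
```

So if Alteryx is staging the temp file on tfs:// instead of gs://, the load step would fail exactly as shown in the error above.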
From other documentation, I read that I should specify the temporary file location in the Output step and/or in the Connection configuration. However, I cannot find such a field or option in either the Connection or the Output step.
I also do not see a "Google BigQuery Bulk Connection" option in Alteryx Designer Cloud.
Any suggestions would be appreciated!
Thanks