I have a workflow where I grab data from Google Sheets and then push that data into a BigQuery database. This is the full workflow. Grabbing the data takes 5 seconds; writing it takes 3 minutes, and there are only 20 rows of data. Worse, every row written becomes a duplicate of the first row.
INSERT INTO `SPRAY_BOOKING`.`TEST1` (`TIME`,`NATIONAL`,`SALES_TYPE`,`CUSTOMER`,`PLACEMENT`,`EMAIL`,`SITE`,`WEEK_BOOKED`,`DAYS_BOOKED`,`BOOKING_TYPE`,`MATERIAL`,`RESPONSILBE1`,`RESPONSIBLE2`,`YEAR`) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?)
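The symptom (every written row becoming a copy of the first) is consistent with the driver binding the first row's parameters once and reusing them for every execution. As an illustrative sketch only, using Python's stdlib sqlite3 as a stand-in for the Simba/BigQuery driver, and with made-up table and column names, a correct parameterized batch insert binds a fresh tuple per row:

```python
import sqlite3

# Stand-in table (sqlite3 in place of the BigQuery/Simba setup; names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TEST1 (TIME TEXT, CUSTOMER TEXT, SITE TEXT)")

rows = [
    ("08:00", "Acme", "Oslo"),
    ("09:30", "Globex", "Bergen"),
    ("11:15", "Initech", "Stavanger"),
]

# executemany binds a fresh parameter tuple for each row. A driver that kept
# re-binding the first tuple would produce the duplicate-row symptom above.
conn.executemany("INSERT INTO TEST1 (TIME, CUSTOMER, SITE) VALUES (?, ?, ?)", rows)
conn.commit()

for row in conn.execute("SELECT * FROM TEST1 ORDER BY TIME"):
    print(row)
```

If the same 20-row batch came back with 20 identical rows here, the binding step, not the data, would be the culprit, which is why the problem looks like a driver issue rather than a workflow issue.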
@jensroy I would consider raising this more formally with email@example.com, as it is probably something that most members of the community cannot replicate without creating a BigQuery account and so on.
Hi @jensroy, were you able to find a solution to this? I have found that the new Write to BigQuery tool is also append/insert only and does not delete existing records. I'd love to hear whatever workaround you came up with, other than writing a view in BQ to grab only the most recently updated record (as that will become more costly over time).
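For anyone landing here, the "view that grabs only the most recently updated record" pattern mentioned above can be expressed with `ROW_NUMBER()` over a partition. A minimal sketch, again using Python's sqlite3 with hypothetical table/column names (in BigQuery the inner SELECT could back a view definition, at the query-cost tradeoff noted above):

```python
import sqlite3

# Hypothetical append-only table: each workflow run appends a new version of every row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (customer TEXT, site TEXT, loaded_at INTEGER)")
conn.executemany(
    "INSERT INTO bookings VALUES (?, ?, ?)",
    [
        ("Acme", "Oslo", 1),
        ("Acme", "Trondheim", 2),  # newer version of Acme's row
        ("Globex", "Bergen", 1),
    ],
)

# Dedup view: keep only the most recently loaded row per customer.
conn.execute("""
    CREATE VIEW latest_bookings AS
    SELECT customer, site, loaded_at
    FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY customer ORDER BY loaded_at DESC
        ) AS rn
        FROM bookings
    )
    WHERE rn = 1
""")

for row in conn.execute("SELECT customer, site FROM latest_bookings ORDER BY customer"):
    print(row)
```

The tradeoff is exactly the one flagged above: the base table keeps growing with every append, so each query through the view scans more data over time.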
Hello. In the end I was not able to advance with this. The duplicate rows seem to be a result of the Simba driver not fully supporting writes to BigQuery. The only other solution I found in the community that might have worked was to use the Download tool to POST a CSV file into BigQuery. In the end, though, I solved this project by using Azure instead of BigQuery. Good luck, and please write back if you solve it; it would be good learning.