I've been working with BigQuery using the Google BigQuery Input tool, and I've noticed that any time I execute a query that returns a sizable data set, it takes a very long time to stream that data from BigQuery back into Designer. I'm assuming it's because the results come back as a stream, but I wanted to see if anyone else has seen this behavior, or if anyone knows of a way to speed it up.
Using the Insert Batch size parameter may help.
I'd recommend trying double and then half of your current setting and going from there based on which performs best: halve again or double again until you find the optimum for your data and network.
I'm pretty new to the Google BigQuery connector. Doubling what, exactly?
Sorry, I got confused and thought you were using the Output tool; the Insert Batch Size parameter only applies there.
I am experiencing the same issue. I compared Designer (2019.4) against Tableau Desktop: using custom SQL to narrow down the data set, Alteryx takes roughly 100x longer than running the same query in Tableau Desktop.
Any feedback is appreciated.
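For reference, the custom SQL was just a narrowing query along these lines (the project, dataset, table, and column names here are made up for illustration):

    SELECT order_id, order_date, amount
    FROM `my-project.sales.orders`       -- hypothetical table
    WHERE order_date >= '2019-01-01'     -- filter before download to shrink the result set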
In the solutions I was building, I actually had to switch to the Google BigQuery ODBC driver provided by Simba. It's a little faster, albeit still not ideal. I did hit some bumps with it, though: Alteryx doesn't appear to translate data types correctly in every case, and I had to explicitly CAST a few numeric columns when pulling from BQ (see the sketch below). It's worth noting that when I had the data type issues, I opened a case with Support, and they told me that the only supported approach was the BigQuery tool, even after I showed them it wasn't feasible.
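As a rough illustration of the CAST workaround (the table and column names are hypothetical, and the types you need to force may differ), the custom SQL I ran through the Simba driver looked something like this:

    SELECT
      order_id,
      CAST(quantity AS INT64) AS quantity,        -- force an explicit integer type
      CAST(unit_price AS FLOAT64) AS unit_price   -- force an explicit float type
    FROM `my-project.sales.orders`                -- hypothetical table

Casting on the BigQuery side gives the driver an unambiguous type to map, which worked around the misreads for me.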
I'm experiencing a similar issue with download speeds and haven't found an adequate solution yet.