I'm using v7 of the Google Analytics connector and it works great.
I know the Google API has a limit of 10k records per request, and the Alteryx GA connector has an iteration limit set at 9999, which totals 1M records.
The thing is, I do not want to compromise my current metrics/dimensions selection, and that is generating nearly 800k records a day.
That means I have to run the workflow once a day (scheduling is currently not an option).
Is there a way to increase the iteration number of the Alteryx GA connector?
Would it be possible to make some sort of macro that iterates the Alteryx GA Connector workflow and changes the date inside the connector module, in order to fetch a week's or a month's worth of data? If that is possible, could somebody point me in the right direction?
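To make the idea concrete, here is a rough Python sketch of the looping I have in mind. It is only a sketch; fetch_day is a hypothetical stand-in for whatever single-day request the connector or API would actually make:

```python
from datetime import date, timedelta

def fetch_day(day):
    """Hypothetical stand-in for a single-day GA request
    (in practice this would be the connector run or an API call)."""
    return []  # would return that day's rows

def fetch_range(start, end):
    """Walk a date range one day at a time so no single request
    has to return more rows than the API allows."""
    rows = []
    day = start
    while day <= end:
        rows.extend(fetch_day(day))
        day += timedelta(days=1)
    return rows

# e.g. one week's worth of data
weekly_rows = fetch_range(date(2016, 1, 1), date(2016, 1, 7))
```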
Haven't tested this myself, but you can open up the Google Analytics macro and search for the iterative macro (Tool318). Open that up and increase the max number of iterations within the Interface Designer. It is currently set to '999999'.
The 9999 iteration max is set because you can only send 10,000 requests per day, and according to the documentation there's no batching you can apply to decrease the number of requests.
Am I correct in thinking that your suggestion was to try to pull aggregated data so that you can make fewer calls? If so, I don't believe you can do it within the GA tool itself, as the calls themselves won't do any different aggregations, but you may be able to build your own connector if the API allows aggregations. Or maybe you can define a different custom metric on the GA side and iterate through those?
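If building your own connector is the route you take, the Core Reporting API (v3) can be paged directly from Python. This is only a minimal sketch under that assumption; the view ID, credentials file, metrics and dimensions below are placeholders you would replace with your own:

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'  # placeholder service-account key file
VIEW_ID = 'ga:XXXXXXXX'            # placeholder view (profile) ID

creds = service_account.Credentials.from_service_account_file(KEY_FILE, scopes=SCOPES)
analytics = build('analytics', 'v3', credentials=creds)

def fetch_all(start_date, end_date, metrics, dimensions):
    """Page through a single query in 10,000-row chunks (the per-request cap)."""
    rows, start_index = [], 1
    while True:
        resp = analytics.data().ga().get(
            ids=VIEW_ID,
            start_date=start_date,
            end_date=end_date,
            metrics=metrics,
            dimensions=dimensions,
            start_index=start_index,
            max_results=10000,
        ).execute()
        rows.extend(resp.get('rows', []))
        if start_index + 10000 > resp.get('totalResults', 0):
            return rows
        start_index += 10000

# e.g. daily sessions and pageviews for January
data = fetch_all('2016-01-01', '2016-01-31',
                 'ga:sessions,ga:pageviews', 'ga:date,ga:sourceMedium')
```

Each 10,000-row page counts as one request against the daily quota, so roughly 800k rows works out to around 80 requests, well under a 10,000-requests-per-day limit.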
Sophia Fraticelli Premium Support Advisor Alteryx, Inc.
Interesting suggestion. Perhaps, but then I'm a bit in over my head.
I'm positive that there is a way to pull more than 1M records through the API, as we have an old PHP service that can do that; however, it does not fetch the correct metrics and dimensions, and it also transforms the data in the process.
I'll probably need to consult our developers in order to get the hang of it.