
Iterative Macros

Daniel_Figueredo
7 - Meteor

Hello all!

 

I have a workflow that checks whether the data is present in two different databases, and if it is, I make an API call (POST) sending this data as my request payload.

 

I have managed to build the whole workflow logic, including the data preparation, joins, and the API call.

 

And that's my issue: since I'm dealing with a lot of data, I keep getting a "413 - Request Entity Too Large" error in the Download tool when I try to send the payload. After some research I found that I could use iterative macros to paginate the data and loop through my workflow results, sending them little by little. But I'm new to Alteryx and its macros, and I'm struggling to figure out how to build this.

 

- Is it possible to call the macro passing the data that met the condition of being in both databases as a parameter?

- How could I set the macro to retrieve, let's say, 1,000 records each loop?
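As a starting point outside of a macro, the pagination idea can also be sketched in Alteryx's Python tool: split the joined records into fixed-size batches and build one JSON payload per batch, so no single POST exceeds the server's size limit. This is a minimal sketch, not the macro itself; the record contents and batch size are placeholder assumptions.

```python
import json

def make_batches(records, batch_size=1000):
    """Split a list of records into consecutive fixed-size batches."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# Hypothetical joined data standing in for the workflow's output.
records = [{"id": n} for n in range(2500)]

# Each batch becomes one JSON payload small enough to avoid HTTP 413;
# each payload would then be sent by its own POST (e.g. via the Download tool
# or requests.post in the Python tool).
payloads = [json.dumps(batch) for batch in make_batches(records)]
```

An iterative macro does the same thing declaratively: each iteration takes the next slice of rows, POSTs it, and loops until no rows remain.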

3 REPLIES

christine_assaad

Hi @Daniel_Figueredo 

 

Have you tried the Throttle tool?

https://help.alteryx.com/2020.1/Throttle.htm

 

Daniel_Figueredo
7 - Meteor

Hello  @christine_assaad !

 

Sorry for the late response. I did try the Throttle tool, but from what I can see the Download tool only starts running after the Throttle has passed all the data through (screenshot below). I also tried placing the tool right before the Download tool, but since my payload is a single JSON it is treated as only one record, so I still get the 413.

 

DanielG
12 - Quasar

Following.

 

I am trying to do something similar with a POST to an API. Sometimes it works and passes all the records through, and sometimes it doesn't... no idea why.

 

I have been developing a batch macro for this particular API by adding a Tile tool before the macro and then passing the tiles in through the Control Parameter, to see if that slows it down a bit. However, I am not bundling all my items into one payload; I have about 2,000 rows that I am trying to pass through one at a time, each with its own individual payload.
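Since the failures described here are intermittent and surface no errors, one way to tell transient API hiccups apart from a macro problem is to wrap each per-row POST in a simple retry with backoff. A minimal sketch, assuming each row is sent as its own payload; the sender function here is a hypothetical stand-in for whatever actually performs the POST (e.g. a wrapper around requests.post that raises on a bad status).

```python
import time

def post_with_retry(send, payload, retries=3, delay=1.0):
    """Call send(payload); on failure, retry up to `retries` attempts
    with a simple linear backoff between attempts."""
    for attempt in range(1, retries + 1):
        try:
            return send(payload)
        except Exception:
            if attempt == retries:
                raise  # give up after the last attempt
            time.sleep(delay * attempt)

# Illustration with a fake sender that fails twice, then succeeds.
calls = []
def flaky_send(payload):
    calls.append(payload)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = post_with_retry(flaky_send, {"row": 1}, retries=5, delay=0)
```

If rows still go missing even with retries, logging each payload alongside its response status would show whether the API is silently dropping requests or the macro is skipping iterations.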

 

Frustrating, to be honest, because it works and then it doesn't, and when it doesn't there are no errors; it just returns a smaller subset of data, or none at all. And I am using the same source dataset every single time.

 

I don't know enough about APIs or macros to know whether this is an API issue or a macro issue, so I will be keeping an eye on this thread for clues. 🙂

 

Good luck.

 
