Hello all!
I have a workflow that checks whether the same data is present in two different databases, and if it is, I make an API call (POST) sending that data as my request payload.
I have managed to build the whole workflow logic, including the data preparation, joins and the API call.
And that's where my issue is: since I'm dealing with a lot of data, I keep getting a "413 - Request Entity Too Large" error in the Download tool when I try to send the payload. After some research I found that I could use an iterative macro to paginate the data and loop through my workflow results, sending them in smaller batches. But I'm new to Alteryx and its macros, and I'm struggling to figure out how to build this.
- Is it possible to call the macro passing the data that met the condition of being in both databases as a parameter?
- How could I set up the macro to retrieve, let's say, 1,000 records each loop?
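To make it clearer what I'm after, here is a rough sketch in Python (not Alteryx) of the batching behavior I want the iterative macro to reproduce: split the joined records into chunks of 1,000 and send one POST per chunk. The record shape and the endpoint URL are just placeholders.

```python
def batches(records, size=1000):
    """Yield successive chunks of `size` records from the full result set."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Stand-in for the data that exists in both databases after the joins.
rows = [{"id": i} for i in range(2500)]

for chunk in batches(rows):
    # In the real workflow, each chunk would become one request payload, e.g.:
    # requests.post("https://example.com/api", json=chunk)  # placeholder URL
    print(len(chunk))
```

With 2,500 rows this would produce three requests (1,000 + 1,000 + 500 records), which is the looping behavior I'm hoping the macro can do.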