I have a workflow that fetches records from an API using the Download tool.
I need to process 1 million IDs.
When I execute the workflow, it works fine until the Download tool fetches data for the IDs.
If there is an error for any single ID, the Download tool errors out and the whole workflow stops.
I need a solution where we can ignore the error and the workflow continues with the next ID.
We tried using a macro. It handles the situation, but the macro takes 10 hours to process all 1 million IDs,
which is not a feasible solution. Sometimes we need to process 4 million as well.
Any input from the team will save me hours of effort.
Hi @Jafar
There's no way to change the behaviour of the Download tool, so you might want to try a nested macro approach. Start by grouping your IDs into batches of 10,000. If the entire batch downloads successfully, move on to the next batch. If there is an error, start calling the IDs in that batch one by one. Depending on the number of errors in your data, this can increase the overall speed enormously.
You can also optimize this technique even further. Instead of looping over all the IDs of a bad batch individually, divide the batch size by 10 and download 1,000 at a time, then 100, then 10, and finally 1 by 1. This keeps your batch sizes as large as possible and increases your throughput.
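To make the shrinking-batch idea concrete, here is a minimal Python sketch of the same logic. This is not Alteryx macro code; `download_batch` is a hypothetical stand-in for your bulk API call (simulated here with a hard-coded set of failing IDs), and the recursion divides the batch size by 10 on each failure until single IDs are isolated and skipped.

```python
BAD_IDS = {7, 4242}  # simulated failing IDs, purely for this sketch

def download_batch(ids):
    """Stand-in for the real bulk fetch; raises if any ID in the batch fails."""
    if any(i in BAD_IDS for i in ids):
        raise RuntimeError("batch contains a failing ID")
    return {i: f"record-{i}" for i in ids}

def fetch_with_fallback(ids, batch_size=10_000):
    """Try large batches first; on failure, retry with a 10x smaller batch.

    At batch size 1, a failing ID is recorded and skipped instead of
    stopping the whole run.
    """
    results, failed = {}, []
    for start in range(0, len(ids), batch_size):
        chunk = ids[start:start + batch_size]
        try:
            results.update(download_batch(chunk))
        except Exception:
            if batch_size > 1:
                # Recurse with a 10x smaller batch: 10000 -> 1000 -> ... -> 1
                sub_ok, sub_bad = fetch_with_fallback(chunk, max(batch_size // 10, 1))
                results.update(sub_ok)
                failed.extend(sub_bad)
            else:
                failed.append(chunk[0])  # single bad ID: log it and move on
    return results, failed
```

With two bad IDs out of 10,000, only a handful of sub-batches are ever retried, so nearly all of the work still happens in large requests.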
Dan