- Is it one API call or is each line being submitted separately?
- Are you using the Download tool, or calling the API manually with cURL etc.?
The key to a solution here is identifying the failing records, which means making separate API calls and re-running any call that fails. The process of re-running until everything has completed can be handled with an iterative macro, where each record keeps going to the iterative output until its call succeeds.
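Outside of Alteryx, the per-record idea can be sketched in Python; the endpoint URL and payload shape below are placeholders rather than anything from your workflow. The point is simply that one call per record lets you separate the successes from the failures:

```python
import requests

API_URL = "https://example.com/api/records"  # placeholder endpoint, not from the original post

def submit_records(records, timeout=30):
    """Submit each record as its own API call so failures can be identified individually."""
    succeeded, failed = [], []
    for record in records:
        try:
            response = requests.post(API_URL, json=record, timeout=timeout)
            if response.ok:
                succeeded.append(record)
            else:
                failed.append(record)
        except requests.RequestException:
            # Connection errors count as failures too, so the record can be retried later.
            failed.append(record)
    return succeeded, failed
```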
What I have done previously on a site was to wrap the calls in an iterative macro. I'll explain the situation I had so you can see if it fits.
I was calling an API where any call returning more than 500 rows produced an error, so I had to split that call into multiple smaller API calls and re-run. The advantage of the iterative macro was that if one call failed, only that iteration failed and the next iteration still kicked off. I'd advise that if an API call fails, you add it back to the bottom of your iteration queue so that it tries again after everything else has had a chance to run.
The actual iterative macro is pretty simple: you split the first API call off and process it, and the rest goes to the iterative output. The same can also be done with a batch macro on RecordID.
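As a rough Python analogue of that iterative pattern (again with a placeholder endpoint, and an iteration cap standing in for the macro's maximum-iteration setting), each pass takes the next call off the front of the queue and, if it fails, pushes it back onto the bottom so it is retried after everything else:

```python
from collections import deque

import requests

API_URL = "https://example.com/api/records"  # placeholder endpoint, not from the original post
MAX_ITERATIONS = 1000                        # safety cap, like an iterative macro's iteration limit

def run_until_complete(records):
    """Process one call per iteration; failed calls rejoin the back of the queue and are retried."""
    pending = deque(records)
    completed = []
    for _ in range(MAX_ITERATIONS):
        if not pending:
            break
        record = pending.popleft()            # "split the first API call off"
        try:
            response = requests.post(API_URL, json=record, timeout=30)
            response.raise_for_status()
            completed.append(record)
        except requests.RequestException:
            pending.append(record)            # failed call goes back to the bottom of the queue
    return completed, list(pending)           # anything still pending never succeeded
```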
The tool does output to the results window when it fails to connect, but there is no way to get that message back into the workflow in real time to set up a retry iterative macro.
All the other solutions involved writing a log file, then, after the run, reading the log file to identify which records hadn't connected and re-running them. Not ideal, and it breaks down if you are pulling from multiple tools in one flow.
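For completeness, a minimal sketch of that log-file pattern in Python, where the file name and the "ok" status value are purely illustrative: write one status line per call during the run, then read the log back afterwards to build the list of records to re-run.

```python
import csv

LOG_PATH = "api_call_log.csv"  # illustrative file name, not from the original post

def log_result(record_id, status):
    """Append one status line per API call during the run."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([record_id, status])

def records_to_rerun():
    """After the run, read the log back and return the record IDs that never connected."""
    with open(LOG_PATH, newline="") as f:
        return [row[0] for row in csv.reader(f) if row and row[1] != "ok"]
```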