I have created a workflow to connect to a JSON API and get data. I have tested it retrieving 10 rows and it works fine, but as soon as I ask it to get 100 or 1,000 records, the problem starts after the API call, where I have a JSON Parse tool. The API response file is about 30 MB, but from the parse and transformation onwards the size balloons to 100 GB in the 100-record case, or over 600 GB in the 1,000-record case, so I need to kill the process.
Without seeing the workflow and the API response it would be hard to diagnose, but one helpful tip would be to drop any unnecessary fields between the Download tool and the JSON Parse tool. In most instances you would have the URL you requested, the JSON response, and the returned message (successful, OK, failed, etc.).
I'm guessing you won't need those, so just drop them. The JSON Parse tool will repeat those fields for each record as it parses, so it's a ton of redundancy. If you really need them, keep just a record ID and join the metadata back to your data after the parse and transformation.
Depending on the URL size and the other fields, this could help considerably.
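The blow-up mechanism is easy to sketch outside Alteryx. A row-per-token parse emits one record per parsed name/value pair, and any extra columns on the input row (URL, headers, status) get copied onto every one of those records. A minimal Python sketch of this (the URL, header string, and field names below are all made up for illustration):

```python
import json
import sys

# Placeholder request metadata that would sit alongside the response
# on the Download tool's output row.
url = "https://api.example.com/data"
headers = "HTTP/1.1 200 OK\r\nContent-Type: application/json"
response = json.dumps(
    [{"id": i, "name": f"rec{i}", "value": i * 1.5} for i in range(100)]
)

def parse_keep_all(url, headers, body):
    # Carry URL + headers on every token row — this is the redundancy
    # that makes the parsed output balloon.
    for i, rec in enumerate(json.loads(body)):
        for key, val in rec.items():
            yield (url, headers, f"[{i}].{key}", val)

def parse_drop_first(body):
    # Drop the metadata columns before parsing; keep only token rows.
    for i, rec in enumerate(json.loads(body)):
        for key, val in rec.items():
            yield (f"[{i}].{key}", val)

bloated = list(parse_keep_all(url, headers, response))
lean = list(parse_drop_first(response))

# Same number of token rows either way, but every bloated row carries
# a full copy of the URL and headers.
bloat_size = sum(sys.getsizeof(x) for row in bloated for x in row)
lean_size = sum(sys.getsizeof(x) for row in lean for x in row)
print(len(bloated), len(lean), bloat_size, lean_size)
```

With 100 records of 3 fields each, both variants emit 300 token rows, but the "keep everything" variant is several times larger, and the gap scales linearly with record count and metadata size — which matches the 30 MB response exploding during parsing.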
Attached is my workflow; I can't share the API link as this is private, work-related data, but I have attached a screenshot of the output. Reading in 100 records and putting them in the correct order took over 10 minutes, which is unworkable, as daily I would be reading in 60,000+ records with 240 fields each.