Hi,
I am using the Download tool to retrieve data via an API by providing a URL. This works fine.
The API does not allow retrieving several sets of data at once, which means I have around 50 different URLs.
I do not want to manually create 50 Download tools, so I was trying to use a list of all the URLs and "loop through it". I tried an iterative macro but could not get it working. Can you help me? :)
The workflow is very simple: just a list of URLs and a Download tool that uses the URL as input.
Thanks!!
Hi @Rico_Widmayer ,
I'd need more information to ascertain exactly what you're doing, but there are a number of approaches to this.
1. Use the primary URL to download the page which contains the sub-URLs. You can then scrape this HTML and parse out the URLs on that page, and feed them into a subsequent Download tool, which will loop through each one (see the first sketch after this list).
2. If you are downloading data from page 1, then page 2, etc., you can determine the URLs required simply by checking whether there is a page= parameter in the URL string. You can then use the Generate Rows tool to generate the page numbers required and append them to the base URL that carries page=. This creates all the required URLs, which you can feed into a second Download tool (see the second sketch after this list).
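
For illustration only, here is a rough Python sketch of approach 1, in case it helps to see the logic outside of Alteryx. The index URL and the link-matching pattern are assumptions; your API and page structure will differ.

```python
import re
import urllib.request

# Hypothetical index page that lists the sub-URLs (assumption: replace with your own)
INDEX_URL = "https://example.com/api/index"

# Download the index page's HTML
with urllib.request.urlopen(INDEX_URL) as resp:
    html = resp.read().decode("utf-8")

# Parse out the sub-URLs; this simple pattern grabs any absolute href value
sub_urls = re.findall(r'href="(https?://[^"]+)"', html)

# Loop through each sub-URL and download its data,
# just as a Download tool fed with one URL per row would
for url in sub_urls:
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    print(url, len(data), "bytes")
```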
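
And a similar sketch for approach 2, mirroring what Generate Rows plus a Formula tool would do before the Download tool. The base URL and the page count of 50 are assumptions.

```python
import urllib.request

# Assumed base URL ending in a page= parameter; substitute your real endpoint
BASE_URL = "https://example.com/api/data?page="

# Generate the page numbers (the Generate Rows step), append them to the
# base URL (the Formula step), then download each page in turn
for page in range(1, 51):  # assuming roughly 50 pages
    url = f"{BASE_URL}{page}"
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    print(url, len(data), "bytes")
```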
I hope this helps,
M.