I have a workflow that handles more data than my PC can process at once, so I split my data geographically into provinces and also into different years.
So the process is as follows:
I have 1 workflow that basically prepares an input file for the next workflow per year and per province.
In the next workflow I need to iterate over those files and run the flow separately on each one of them, generating an output file per year and per province.
I turned both of those into batch macros and I call them from 2 workflows.
The first workflow calls the first batch macro; the macro itself has 2 macro inputs, but when placed in the workflow it shows 3 input anchors (the extra one being the Group By input).
The second workflow takes as input all the files in a specific folder and calls the 2nd batch macro via a control parameter that updates the input file location.
My problem is that neither of the batch macros seems to work as intended. On top of that, the second batch macro should in fact iterate over the years, since the results of one year influence the results of the next year.

Do I understand correctly that a batch macro gives you the possibility to run a workflow once per input without having to read everything into memory? That is exactly what I am trying to avoid. Or do I need an iterative macro in this case?
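To make the distinction concrete, here is a minimal Python sketch (not Alteryx) of the two iteration patterns in question, using hypothetical per-year values instead of real input files. A batch macro corresponds to independent runs per input; an iterative pattern carries state from one year into the next:

```python
def run_batch(inputs):
    # Batch-style: each run is independent; no state crosses iterations.
    return [x * 2 for x in inputs]

def run_iterative(inputs, state=0):
    # Iterative-style: the result of year N feeds into year N+1.
    outputs = []
    for x in inputs:
        state = state + x  # carried forward across years
        outputs.append(state)
    return outputs

years = [10, 20, 30]
print(run_batch(years))      # independent per year: [20, 40, 60]
print(run_iterative(years))  # cumulative across years: [10, 30, 60]
```

If the second workflow truly needs last year's output as this year's input, the carried-state pattern is what an iterative macro provides, whereas per-province splitting (no cross-dependency) fits the batch pattern.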