I have a macro solution to an issue I was facing: outputting to an Excel file from a workflow and then re-inputting that output in the same workflow.
The solution (found somewhere on the Community) was to create a batch macro that simply reads the input file back in. I connected this to a Block Until Done tool and it worked. However, the input file has 833 rows, and the batch macro runs once per row, reading all 833 rows on every run. So I end up with roughly 693k rows once the macro completes.
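To make the row-count math concrete, here is a small illustrative sketch (plain Python, not Alteryx) of why a batch macro that re-reads the whole file on every run multiplies the output. The numbers are from my case; the variable names are just for illustration:

```python
# A batch macro runs once per row in its control input.
# If the control input is the same 833-row file the macro reads,
# each of the 833 runs re-reads all 833 rows, and the runs' outputs
# are unioned together.
rows_in_file = 833
control_rows = rows_in_file          # one macro run per control row

total_output_rows = control_rows * rows_in_file
print(total_output_rows)             # 693,889 -- the ~693k rows observed
```

This is why the growth is quadratic: doubling the file size quadruples the output. Feeding the macro a single-row control input (e.g., via a Sample tool set to return the first row only) would make it run exactly once.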
This is going to become a big issue as the original output file grows over time: the final output grows with the square of the row count, significantly damaging performance.
The question is: how can I stop my batch macro from running once per row and instead have it output the whole data set just once?
Here is part of the workflow:
[Workflow screenshot]
Here is the Macro:
[Macro screenshot]