hi folks
I have a workflow that collects a sizeable volume of data (c. 800k rows) from a database view. Currently I read the data into the workflow once at the start, and I also send that data to a macro later on in the workflow.
Within the workflow the data is processed successfully and the output is created as expected. In a separate stream I send the same data to a macro for its own processing and output, discrete from the main workflow's output.
We understand the data volume will grow significantly (perhaps 10x) "soon", and I have concerns about speed and efficiency.
Can anyone shed light on whether I should continue to send all of the data across input/output tools to the macro, OR read the data a second time within the macro itself?
Thanks, as always.
ianjonna
Hi @ianjohnston
Since you are connecting to a database for your data, have you tried using the In-DB Tools?
The In-DB tools will allow you to process the data directly within the database without the data having to travel over the network.
This does mean you are limited to the In-DB tools for that part of the workflow; however, once you reach a point where you need the standard tools, you can use the Data Stream Out / Data Stream In tools to build the rest of the workflow before outputting to the DB.
This would help with the speed and performance of the workflow.
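To make the idea concrete, here is a minimal Python sketch of the same pattern the In-DB tools apply: the heavy work is pushed to the database as SQL, so only the small summary result crosses the network rather than the full 800k-row view. The DSN, view, and column names below are hypothetical placeholders, not anything from your environment.

```python
# Minimal sketch of the In-DB idea: run the aggregation inside the
# database and pull back only the (much smaller) result set.
# "my_dw", "sales_view", and the column names are hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=my_dw")  # hypothetical DSN
cursor = conn.cursor()

# The GROUP BY runs in the database engine; only the summary rows
# travel over the network, not the full 800k-row view.
cursor.execute("""
    SELECT region, COUNT(*) AS row_count, SUM(amount) AS total
    FROM sales_view
    GROUP BY region
""")
for region, row_count, total in cursor.fetchall():
    print(region, row_count, total)

conn.close()
```

The same trade-off applies in Alteryx: the less data you stream out of the database, the less the 10x growth will hurt.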
On your specific question, I would recommend against reading the data a second time within the macro itself. If you can read the data in once and produce both the first output and the macro's (second) output within the same workflow, that would be the better option.
You can also use a Block Until Done tool to make sure the first output is generated before the macro's data is processed.
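As a rough illustration of that "read once, output twice, in order" data flow (plain Python standing in for the workflow, not Alteryx itself), here is a minimal sketch; the file names and the summary step are made up for the example.

```python
# Minimal sketch of "read once, output twice, in order": the source is
# read a single time, the first output is fully written before the
# second (macro-equivalent) step runs, mimicking Block Until Done.
# File names and the groupby step are hypothetical.
import pandas as pd

df = pd.read_csv("source_extract.csv")  # single read of the large dataset

# First output stream: written before anything else runs.
df.to_csv("main_output.csv", index=False)

# Second stream (the "macro"): reuses the in-memory frame instead of
# re-reading the source, so the data is only ever read once.
summary = df.groupby("category", as_index=False)["value"].sum()
summary.to_csv("macro_output.csv", index=False)
```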
Hope this helps.