Good afternoon. I would like to develop a workflow that processes the records in chunks: the fuzzy match would run on 35,000 records at a time and store each result in a temporary file, and once all the records have been processed, the workflow would unify the data.
The intent is to run the analysis on more than 1 million records, but partitioned to reduce the load on the server.
Any ideas on how to carry out this analysis?
Hi @rdusi001
The best option for passing records through your workflow in groups is a batch macro. See more info here:
https://www.youtube.com/watch?v=YIAbQGQ_Hkg
You'll need to add some steps at the start of your workflow to assign a batch number to each block of 35,000 rows (you could do this with a multi-row formula) and then use this field as the control parameter. For example:
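A minimal sketch of such a multi-row formula, assuming a [RecordID] field from an upstream Record ID tool and a new output field named [Batch] (both names are illustrative):

IF [RecordID] = 1                              // first row opens batch 1
THEN 1
ELSEIF [RecordID] <= 35000 * [Row-1:Batch]     // still inside the current batch
THEN [Row-1:Batch]
ELSE [Row-1:Batch] + 1                         // crossed the boundary, start the next batch
ENDIF

A plain Formula tool with CEIL([RecordID] / 35000) would assign the same batch numbers without the row-by-row logic.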
Good luck!
I started to develop the batch macro but I got stuck at this point: I used the Multi-Row Formula tool, but the macro is not looping.
This is the formula in the Multi-Row Formula tool:
IF [RecordID] = 1
THEN 1
ELSEIF [RecordID] <= 10 * [Row-1:CP]
THEN [Row-1:CP]
ELSE [Row-1:CP] + 1
ENDIF
Hi @rdusi001
Instead of the multi-row formula, you may want to consider something like this.
Each time the macro loops, the filter will be updated with the current value of the Control Parameter.
Iteration 1 will process records 1 to 20, iteration 2 will process records 21 to 40, iteration 3 will process records 41 to 60, and so on.
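A minimal sketch of the Filter tool expression, assuming the batch size of 20 from the example and an Action tool that swaps the current iteration number into the expression on each loop ([BatchNumber] is an illustrative placeholder for the value the Action tool updates; in practice it typically replaces a hard-coded number in the expression):

[RecordID] > ([BatchNumber] - 1) * 20    // everything after the previous batch
AND [RecordID] <= [BatchNumber] * 20     // up to the end of the current batch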
You should move your Record ID tool outside of the loop; otherwise, you'll try to add a new RecordID with each iteration.
Dan