Hello all, I have a flow that reads a hyper file and overwrites it with some values (1 in the screenshot). Then it appends some other values (2 in the screenshot).
In step 1, I have 0 records to output, so the hyper file should contain 0 records.
In step 2, I have 1.9 million records to output, so the hyper file should end up with 1.9 million records.
I've set up the Parallel Block Until Done tool and it seems to work fine (screenshot 2).
But when the flow finishes, I can see in the log that output 2 ran first, so my final file has 0 records (screenshot 3).
Can I add something to force the flow to run output 1 before output 2?
Thanks!
My main question is: why have you set it up this way? There may be a logical reason, but I'm struggling to think of one.
Why not use a Union tool and write both streams of data together to a single output?
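In case it helps to see the idea concretely, here is a minimal sketch of "union first, then a single write" using pandas and the pantab package rather than Alteryx tools. The frame contents, the history.hyper path, and the Extract table name are all illustrative assumptions, not details from your flow:

```python
import pandas as pd
import pantab

# Hypothetical stand-ins for the two streams that currently feed two
# separate Output tools.
kept_history = pd.DataFrame({"month": ["2024-04"], "value": [10.0]})
new_month = pd.DataFrame({"month": ["2024-05"], "value": [20.0]})

# Union the streams first, then do ONE write to the hyper file, so
# there is only a single output step and no ordering problem.
combined = pd.concat([kept_history, new_month], ignore_index=True)
pantab.frame_to_hyper(combined, "history.hyper", table="Extract")
```

With a single output there is nothing left to sequence, which is exactly what the Union tool achieves inside the flow.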
Ben
What happens in this flow is:
A monthly Excel file is received and appended to a hyper file (so we can keep historical data).
Sometimes we have to reprocess a file. For example, let's say the hyper file has Jan through May and then I receive a new May file. When I run the flow, everything from May should be erased from the hyper file and then the new data appended.
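For reference, that erase-then-append step can also be expressed directly against the hyper file with the Tableau Hyper API. This is only a sketch: the Extract.Extract table name, the month column, the illustrative row, and the file path are assumptions, not details from the actual flow:

```python
from tableauhyperapi import (
    Connection, HyperProcess, Inserter, TableName, Telemetry,
)

HYPER_FILE = "history.hyper"               # assumed path
EXTRACT = TableName("Extract", "Extract")  # assumed table name
REPROCESS_MONTH = "2024-05"

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint, database=HYPER_FILE) as conn:
        # Step 1: erase every existing row for the month being reprocessed.
        conn.execute_command(
            f"DELETE FROM {EXTRACT} WHERE month = '{REPROCESS_MONTH}'"
        )
        # Step 2: append the rows from the newly received monthly file.
        with Inserter(conn, EXTRACT) as inserter:
            inserter.add_row([REPROCESS_MONTH, 42.0])  # illustrative row
            inserter.execute()
```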
For some weird reason I didn't think of the Union tool, but it will work perfectly.
Thanks!!!
Hi Fabian,
Have you got the AMP engine enabled? It could be that multiple cores are trying to execute the separate outputs of the workstreams at the same time!
Cheers,
Chris
This Parallel Block Until Done tool is not working as expected for me either. I tried enabling the "Use AMP Engine" option, and that only seemed to create more problems by corrupting my file.