Consider workflows 1, 2, 3, and 4. As part of a process, I have configured the workflows so that WF 3 runs once each time after WF 1 and WF 4 complete. Below is a sample.
The results are also collected in a CSV file that is fed back in as input to the same WF 3 for further processing. So the loop runs, and WF 3 completes the job as expected on the first run.
When this WF is called after WF 4 completes, the same setup should still hold good, but it is not able to generate the file or read it successfully. I get the error below.
Error opening the file "PATH OF THE FILE LOCATION": The process cannot access the file because it is being used by another process.
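For context, this is the standard Windows sharing-violation error: one process still holds the file open while another tries to open it. Outside Alteryx, a common workaround is a retry loop with a short backoff. A minimal sketch in Python (the function name and timings are my own, not part of the workflow):

```python
import time

def read_when_free(path, attempts=5, delay=2.0):
    """Try to read `path`, retrying while another process still holds it
    open (which surfaces as PermissionError/OSError on Windows)."""
    last_err = None
    for _ in range(attempts):
        try:
            with open(path, "r", encoding="utf-8") as fh:
                return fh.read()
        except OSError as err:
            last_err = err          # file still in use; wait and retry
            time.sleep(delay)
    raise RuntimeError(f"{path} still locked after {attempts} attempts") from last_err
```

This does not fix the underlying lock, but it lets a downstream reader survive a brief window where the writer has not yet released the file.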
What am I trying to accomplish?
Example: I am trying to output summaries, totals, and other business calculations to separate CSV files for multiple WFs individually. These workflows are configured with list runners to run in an iterative fashion. When I try to reuse WF 3, I suspect it gets into a deadlock and cannot read the files, because I believe they are still being actively read by the earlier process. Is this a valid assumption?
I cannot use dynamically named files, because I always want to keep the results static; otherwise I would have to manage the additional summary files generated on every run. I am trying to minimize file-keeping operations and use the files for evaluation.
Any thoughts? It would be of great help.
I wish the List Runner had a little more sophistication in letting one output the results one wants to see.
Update: I altered the original post to match the sample image uploaded. Here, WF 3 is referred to as Groups 3 and 5, just for the sake of calling them in order (control precedence). I also realize I could easily make a copy of WF 3, keep different file names, and use that, but the point is keeping the WFs optimized and reusable.
It is not clear from your snapshot where the problem is. If two of your Output tools are writing to the same file, it might cause this issue. Try combining both record streams and then use a Block Until Done tool to write one file at a time. Something like the following:
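The "one file at a time" idea can be sketched in plain Python: finish and close one writer before anything else opens the same file. The file name and columns below are illustrative, not taken from the workflow:

```python
import csv

# Combined records from both streams (illustrative data).
records = [{"id": 1, "total": 10}, {"id": 2, "total": 20}]

# Write everything and close the handle before any downstream reader runs
# -- this is the ordering that Block Until Done enforces inside Alteryx.
with open("summary_combined.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.DictWriter(fh, fieldnames=["id", "total"])
    writer.writeheader()
    writer.writerows(records)

# The handle is closed here, so this open no longer conflicts with the write.
with open("summary_combined.csv", newline="", encoding="utf-8") as fh:
    rows = list(csv.DictReader(fh))
```

The key point is the sequencing: the read only starts after the write has fully completed and released the file.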
@Hannan Essentially, I am loading data into the Redshift server, and I have designed summary stats to tell me which records were processed, written to a summary file with indicators. These multiple processes run in a logical flow, and I want to read the summary files using the log parser to know my status.
So, for example, say 30 records are generated and written to a file called summar.csv. I want the file to be read after the WF completes. What happens is that the loop is still active and does not allow me to read the file using the dynamic input. So I never know which records failed, or anything of that sort, and I am forced to sit and kick off my workflows manually, one by one.
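Assuming the file is eventually released once the loop finishes, one workaround is to poll until it can be opened, then parse the indicators. A sketch (the timeout values and the `status` column name are assumptions; only the summary-file idea comes from the description above):

```python
import csv
import time

def wait_and_read_summary(path, timeout=300.0, poll=5.0):
    """Poll until the workflow releases `path`, then return its rows as dicts.
    Raises TimeoutError if the file stays locked past `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            with open(path, newline="", encoding="utf-8") as fh:
                return list(csv.DictReader(fh))
        except OSError:
            if time.monotonic() >= deadline:
                raise TimeoutError(f"{path} not readable within {timeout}s")
            time.sleep(poll)  # writer still holds the file; try again

# Usage: count failures once the summary is readable
# (assumes a hypothetical 'status' indicator column).
# rows = wait_and_read_summary("summar.csv")
# failed = [r for r in rows if r.get("status") == "FAIL"]
```

This lets the status check run unattended instead of kicking off each workflow manually and eyeballing the files.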