I'm using a List Runner macro to run three consecutive workflows. The third workflow writes out a .yxdb file, which I then read back in the workflow that contains the List Runner. To avoid any overlap between writing and reading, I've added a Parallel Block Until Done and a Dynamic Input to that workflow (I've attached a snippet of it).
This workflow has been working perfectly fine on a subset of my data, but now that I'm running it on my entire dataset, the output file is much larger. Since it worked before, I'm assuming the error has to do with the file size.
I've also had people suggest putting a time block before reading the file, but this workflow already takes over 10 hours to run, so I'm not sure a fixed delay would help in my case.
Any idea how I can get around this? Should I possibly create a temp file? Thanks!
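On the temp-file idea: outside of Alteryx, the usual pattern is to write to a temporary file in the same directory and then rename it into place, so a reader never sees a half-written file. This is a minimal Python sketch of that idea, not anything Alteryx does for you; `write_atomically` is a hypothetical helper name.

```python
import os
import tempfile

def write_atomically(path, data):
    """Write bytes to a temp file in the target directory, then rename.

    os.replace is atomic on the same filesystem, so a reader opening
    `path` sees either the old complete file or the new complete file,
    never a partially written one.
    """
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # ensure bytes hit disk before the rename
        os.replace(tmp_path, path)  # atomic swap into place
    except BaseException:
        os.unlink(tmp_path)  # clean up the temp file on failure
        raise
```

The key detail is that the temp file must live on the same filesystem as the destination, otherwise the rename stops being atomic.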
I had a similar problem writing out to different Excel sheets (with different layouts) in one Excel file. Block Until Done didn't help.
I used the attached macro, combined with the CReW Wait a Sec macro, for each Excel tab. The macro just writes out a file, but the benefit is that it has an output anchor, so I can connect that output to a Wait a Sec macro before writing the next sheet.
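For anyone wanting the same effect outside Alteryx: instead of a fixed wait, you can poll the file until its size stops changing before reading it, which scales with file size rather than guessing a delay. A minimal Python sketch, under the assumption that a stable size for a few consecutive polls means the writer has finished (`wait_until_stable` is a hypothetical helper name):

```python
import os
import time

def wait_until_stable(path, interval=1.0, checks=3, timeout=600):
    """Block until `path` exists and its size stops changing.

    The size must be unchanged for `checks` consecutive polls before
    the write is treated as finished. Returns True on success, or
    False if the timeout expires first.
    """
    deadline = time.monotonic() + timeout
    last_size = -1
    stable = 0
    while time.monotonic() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size:
                stable += 1
                if stable >= checks:
                    return True  # size unchanged long enough
            else:
                stable = 0  # still growing; reset the counter
                last_size = size
        time.sleep(interval)
    return False
```

A stable size is a heuristic, not a guarantee that the writer has closed the file, so it pairs well with the temp-file-and-rename approach if you control the writer.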
My Excel output looks like this:
Chris