I am using a List Runner macro to run 3 consecutive workflows. The third (last) workflow outputs a .yxdb file that is then read back in by the workflow containing the List Runner. I've added a Parallel Block Until Done and a Dynamic Input tool to that workflow to avoid any overlap between writing and reading the file (I've attached a snippet of the workflow).
This workflow has been working perfectly fine with a subset of my data, but now that I'm running it on my entire dataset the output file is a lot larger. Since it was working before, I'm assuming this error has to do with the large file size.
I've also had people suggest putting a timed delay before reading the file, but this workflow already takes over 10 hours to run, and I have no way of knowing how long a delay would be enough, so I'm not sure a fixed wait would help in my case.
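One alternative to a fixed delay that I've been considering is polling the file until its size stops changing before reading it, e.g. from a Python tool or a Run Command event. This is just a sketch, and the path and function name are my own placeholders, not anything from my actual workflow:

```python
import os
import time

def wait_for_stable(path, interval=1.0, timeout=600):
    """Wait until `path` exists and its size stops changing.

    Returns True once the size is unchanged across two consecutive
    checks `interval` seconds apart; returns False if `timeout`
    seconds pass first. A stable size suggests the writer has
    finished, without hard-coding a guess at how long the write takes.
    """
    deadline = time.time() + timeout
    last_size = -1
    while time.time() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size and size > 0:
                return True  # size unchanged across two checks
            last_size = size
        time.sleep(interval)
    return False

# Hypothetical usage before the Dynamic Input reads the file:
# wait_for_stable(r"C:\temp\output.yxdb", interval=5, timeout=3600)
```

The idea is that this scales with the actual write time instead of a guessed wait, but I don't know if it's robust if the writer pauses mid-write.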
Any idea how I can get around this? Should I possibly create a temp file? Thanks!