I have scoured the forums, but failed to find a solution.
I currently have a workflow that dynamically names and creates over 300 files with data from a single input file. The workflow then continues to write to these newly created files with additional sheets. Each file will have between 5-7 sheets. I have always used the CReW Macro Parallel Block Until Done to properly sequence the creation of sheets within the same file.
When creating these files on a network drive, I have ZERO errors. The issue is that it takes over 30 minutes for the files to be created.
I decided to try running it locally to eliminate network latency. It was lightning quick, except we hit an issue:
*******
Output Data (508) Unable to create backup of C:\*****.xlsx to C:\*****.xlsx.bak: Access is denied. (5)
*******
This error did not arise until the third sheet was being created. There was no consistent file that triggered the error, but it was always early in the list of 300. Once the error hits, the workflow stops attempting to write that sheet to any further files and moves on to the next sheet. The same issue recurs on the subsequent sheet: only a few files get the new sheet before the error pops again. This happens on every sheet after that.
Here's what I've discovered:
- When the files & sheet1 are initially created, I face no issues. I have no conflicts to avoid.
- When sheet 2 is created, a .bak for each file is created. This is the first .bak file. No conflicts, no errors.
- When sheet 3 is created, the tool attempts to create another .bak, which fails because a .bak for that file already exists. Now we have issues.
I've tried the legacy .xlsx driver, but no luck; I just got different errors instead. I've also tried adding throttles to slow things down, which didn't help.
The one solution I found was to execute a batch file that deletes all *.bak files after each sheet is written. While tedious and inconvenient, it worked: clearing the .bak files after each pass let the next sheet's write create its own fresh backup.
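For anyone wanting to try the same workaround without a separate batch file, here is a minimal sketch of the same cleanup in Python (which could run from a Python tool or Run Command tool between sheet-writing passes). The folder path and function name are my own; adjust to wherever your 300 files land. It skips any .bak a writer still has locked rather than failing:

```python
from pathlib import Path

def clear_bak_files(folder: str) -> int:
    """Delete every *.bak in `folder`; return how many were removed."""
    removed = 0
    for bak in Path(folder).glob("*.bak"):
        try:
            bak.unlink()
            removed += 1
        except PermissionError:
            # A writer may still hold this backup open; skip it and
            # let a later pass clean it up.
            pass
    return removed

# Example (hypothetical output folder):
# clear_bak_files(r"C:\output")
```

This is the same effect as `del *.bak` in a batch file, just easier to drop inline between passes.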
Any thoughts as to why this happens when executing this type of massive output workflow locally? Any thoughts on a cleaner solution?