
Alteryx Designer Desktop Discussions


Error Output Data: Unable to create backup - Access is denied. (5)

CGIQV
8 - Asteroid

I have scoured the forums, but failed to find a solution.

 

I currently have a workflow that dynamically names and creates over 300 files with data from a single input file.  The workflow then continues to write to these newly created files with additional sheets.  Each file will have between 5-7 sheets.  I have always used the CReW Macro Parallel Block Until Done to properly sequence the creation of sheets within the same file.

 

When creating these files on a network drive, I have ZERO errors.  Issue is, it takes over 30 minutes for these files to be created.

 

I decided to try writing the files locally to eliminate network latency.  It was lightning quick.  Except we have an issue:

 

*******

 

Output Data (508) Unable to create backup of C:\*****.xlsx to C:\*****.xlsx.bak: Access is denied. (5)

 

*******

This error did not arise until the third sheet was being created.  There was no consistent file that triggered the error, but it was always early within the list of 300.  Once the error hits, the workflow completely aborts writing that specific sheet to any further files and moves on to the next sheet.  The same issue occurs on the subsequent sheet: only a few files receive the new sheet before the error pops.  This happens on all further sheets.

 

Here's what I've discovered:

 

- When the files & sheet 1 are initially created, I face no issues.  There are no conflicts to avoid.

- When sheet 2 is created, a .bak for each file is created.  This is the first .bak file.  No conflicts, no errors.

- When sheet 3 is created, there appears to be a conflict where another .bak attempts to be created but fails since a .bak already exists.  Now we have issues.
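The pattern above is consistent with a backup step that refuses to overwrite an existing .bak. Here's a minimal Python sketch of that failure mode; the function name and the exclusive-create behavior are my assumptions about what the Output tool does internally, not anything Alteryx documents:

```python
from pathlib import Path

def make_backup(path: Path) -> bool:
    """Copy a file's contents to <name>.bak, but fail if a .bak already
    exists -- mimicking (by assumption) the Output tool's backup step."""
    bak = path.with_name(path.name + ".bak")
    try:
        # "xb" = exclusive create: raises FileExistsError if the .bak exists
        with open(bak, "xb") as f:
            f.write(path.read_bytes())
        return True
    except FileExistsError:
        return False  # same symptom as "Unable to create backup ... (5)"
```

Under this assumption, sheet 2's write succeeds because no .bak exists yet, while sheet 3's write finds the leftover .bak and fails, which matches the observed pattern.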

 

I've tried the legacy .xlsx driver; no luck, new errors instead.  I've tried putting in throttles to slow things down; that didn't help.

 

The one solution I found was to execute a batch file that deletes all *.bak files after each sheet is written.  While tedious and inconvenient, it did work.  Clearing the .bak files between sheets allows the next sheet to create its own new one.
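For reference, here is a cross-platform sketch of that cleanup step (a Windows batch equivalent would be along the lines of `del /s /q *.bak`; the folder path below is a hypothetical placeholder, not the poster's actual output path):

```python
from pathlib import Path

def clear_bak_files(folder: str) -> int:
    """Delete every leftover .bak under the output folder; run this
    between sheet-writing passes. Returns how many files were removed."""
    removed = 0
    for bak in Path(folder).rglob("*.bak"):
        bak.unlink()
        removed += 1
    return removed
```

In Alteryx terms, the equivalent would be calling the script via a Run Command tool (or a workflow event) sequenced between the Block Until Done outputs for each sheet.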

 

Any thoughts as to why this happens when executing this type of massive output workflow locally?  Any thoughts on a cleaner solution?

4 REPLIES
Qiu
20 - Arcturus

@CGIQV 
I had a similar experience when trying to export more than 500 sheets into one Excel file.
My workaround was to export them in batches.

It appears to me this error is related to some sort of memory limitation.

(screenshot attached)

 

CGIQV
8 - Asteroid

Interesting thought.  But having developed a workaround, I'm convinced this has to do with the .bak files.  If I'm able to clear them before the next sheet is created, there's no issue.

CGIQV
8 - Asteroid

Another odd thing I have noticed is that I'm currently outputting to an extended path in My Documents folder.  

 

I decided to try outputting directly to C:\Alteryx.  For whatever reason, when the third sheet is being generated, rather than getting caught up on one of the first 10 files, it makes it most of the way through before I get my first error.  Sometimes the third sheet will write completely across all 300+ files without an error, and I won't get my first error until sheet 4.

 

I thought it might be some sort of path-length limitation, but the results were inconsistent and did not support that hypothesis.

 

The only thing that has completely resolved the issue is running a batch script that deletes all .bak files after each sheet is written.

elliephant
5 - Atom

Hi,

 

Are you able to share a prototype of the batch macro that deletes the .bak files after each sheet is written, as I face the same issue?

 

Thanks
