
Alteryx Designer Desktop Discussions

SOLVED

Writing Multiple Tabs from different workstream threads

terrence10
8 - Asteroid

Hello,

 

I am trying to output to an Excel file with multiple tabs.

Each Excel tab is saved from a different Alteryx workstream thread.

 

However, this runs into a problem: each thread tries to write to the same Excel file simultaneously, and Alteryx throws a "cannot open file" error because the file is in use:

Error: Output Data (93): Unable to Open archive for zipping: *Filename* Can't open file: *Filename*: The file exists. (80)

 

See the workflow screenshot attached: the second thread tries to write to the Excel file at the same time as the first thread (different tab) and errors out.

 

I know the Block Until Done tool can usually be used (when tabs are being written from one thread).

    However, is there a way to do this with multiple threads? I put the Block Until Done tool in my current workflow (it doesn't work as expected), but the idea is to write the top thread's tab first, then the second thread's, and so on.

    Do I have to move the Block Until Done tool earlier in my workflow? I would like to avoid this, as the beginning has two database pulls that I would like to run simultaneously.

AlteryxWorkflowWriteTabs.png

 

Is there a way to queue up these different "threads" so that each one waits until the previous thread has written its Excel tab before the next tab write starts?

Thanks!

 

 

8 REPLIES
Gaurav_Dhama_
12 - Quasar

1. Add a Block Until Done tool just before the first output.

2. Add a Record Count tool to the second anchor of the Block Until Done, and append its output to the second stream just before the second output.

3. Add a Select tool and drop this appended new column.

 

...and you will have the outputs running one after another.
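
For anyone who wants the same pattern outside Designer, here is a minimal Python sketch of the underlying idea: one tab written at a time, each write closing the file before the next begins. This is only an analogy, not the Alteryx solution above; pandas, openpyxl, and the file/sheet names are illustrative assumptions.

    import pandas as pd

    # Stand-ins for the workflow streams; real data would come from the database pulls.
    streams = {
        "Tab1": pd.DataFrame({"a": [1, 2]}),
        "Tab2": pd.DataFrame({"b": [3, 4]}),
    }

    path = "output.xlsx"
    for i, (tab, df) in enumerate(streams.items()):
        # mode="w" creates the file on the first pass; mode="a" appends a new
        # sheet afterwards, so each write only starts after the previous one
        # has finished and closed the file (the same serialization the Block
        # Until Done chain enforces in the workflow).
        mode = "w" if i == 0 else "a"
        with pd.ExcelWriter(path, engine="openpyxl", mode=mode) as writer:
            df.to_excel(writer, sheet_name=tab, index=False)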

terrence10
8 - Asteroid

Very cool! Thank you @Gaurav_Dhama_ !

 

It's not the prettiest on the canvas, but a clever solution!

    Adding a Record Count tool, then appending the record count to the next thread, then dropping the record count before writing. Very neat!

    I've expanded it to all 4 streams; I had to use a second Block Until Done tool to get 4 outputs.

Attached is a screenshot of what this looks like.

 

Thanks again!

AlteryxWorkflowWriteTabsv2.png

Gaurav_Dhama_
12 - Quasar

Perfectly done!

jrlindem
11 - Bolide

@Gaurav_Dhama_ I do this very similarly, but by [Sample]'ing 1 row and then removing all of the fields in the subsequent append. I like yours better, since it reduces the number of fields I need to deselect.

 

Alternatively, I have been able to achieve good success using Control Containers to, for example:

  1. Create the file (c.container 1)
  2. Write to the first tab (c.container 2)
  3. Write to the next tab, etc. (c.container 3...4...etc.)
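
To make the staging concrete, here is a rough Python sketch of the same create-then-append ordering that the Control Containers enforce; openpyxl and the file/sheet names are illustrative assumptions, not part of the workflow.

    from openpyxl import Workbook, load_workbook

    path = "output.xlsx"

    # Containers 1-2 above: create the file and write the first tab.
    wb = Workbook()
    ws = wb.active
    ws.title = "Tab1"
    ws.append(["col1", "col2"])
    wb.save(path)

    # Container 3: reopen the finished file and add the next tab. Each stage
    # opens, writes, and saves before the next begins, so the file is never
    # held by two writers at once.
    wb = load_workbook(path)
    ws = wb.create_sheet("Tab2")
    ws.append([1, 2])
    wb.save(path)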

Great solution though, thanks for sharing!

sufregs
7 - Meteor

@Gaurav_Dhama_ I have this set up, but I still get the error "file is being used by another process". It seems to work fine until it gets to the Directory tool. Any help in resolving this would be appreciated.

 

sufregs_0-1755896260623.png

 

Gaurav_Dhama_
12 - Quasar

By any chance, are you reading the file that is being created?

Can you also check if this file is being backed up by OneDrive?

sufregs
7 - Meteor

No. The files I'm reading aren't being created in this workflow.

I don't quite understand when you ask if it is being backed up by OneDrive. I am connecting to the files on SharePoint via my OneDrive, the same way the other files are connected as well.

Gaurav_Dhama_
12 - Quasar

Many times I have faced an issue where, the moment Alteryx writes an output to a OneDrive-synced folder on the local system, OneDrive starts backing it up to the server, making the file unavailable.

You can give it a shot: output this data to a temp location and see if you still get the same error.
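
One way to script that test: write the workbook to the local temp directory first, then move the finished file into the synced folder in one step, so the sync client only ever sees a complete workbook. The paths and openpyxl usage below are illustrative assumptions, not part of the original workflow.

    import os
    import shutil
    import tempfile
    from openpyxl import Workbook

    # Hypothetical paths: the temp dir is outside OneDrive's sync scope.
    tmp_path = os.path.join(tempfile.gettempdir(), "output.xlsx")
    final_path = os.path.join(os.path.expanduser("~"), "OneDrive", "output.xlsx")

    wb = Workbook()
    wb.active.append(["test"])
    # If this save succeeds where the synced-folder write failed, the error
    # likely comes from the sync client locking the file mid-write.
    wb.save(tmp_path)

    shutil.move(tmp_path, final_path)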
