
Alteryx Designer Desktop Discussions

SOLVED

Batch macro failing and then skipping to next iteration

cgoodman3
14 - Magnetar

Hi,

 

I've got a batch macro where the control parameter is updating with different years, with the actual macro pulling a list of all of the PDF documents on the webpage for that year, and then using a final download tool to pull back the PDF document.

 

The URL is structured as http://domain.com/public-filings?Yearid=2019, so the control parameter updates with 2019, 2018, 2017, etc.

 

However, when running it, each batch retrieves a list of about 30 PDF documents, but I've noticed that if a file in a particular batch (say 2017) fails to download, it doesn't continue with the other files in that batch of 30; it just skips straight to the next value in the parameter, i.e. 2016. Is there any way of stopping this behaviour?

 

I've got a redacted version of the results pane below; as you can see, there is an error in the 2017 files and it jumps straight to 2016.

Workflow error example.png

 

It's also tricky to replicate the issue, as sometimes running it a second time doesn't produce the error.

Chris
Check out my collaboration with fellow ACE Joshua Burkhow at AlterTricks.com
4 REPLIES
afv2688
16 - Nebula

Hello @cgoodman3 ,

 

The error sometimes appears when you are working with the same file; either that, or the file really is open by somebody in your organization. Try adding a Wait Until Done before the write; this may help solve the issue.

 

Otherwise, I would need an image of your current workflow to get an idea of the problem and a possible solution.

 

Regards

cgoodman3
14 - Magnetar

So I thought it might have been a read/write conflict with the files being written to Google Drive using File Stream; however, I've just tried saving only to a local drive and I get the same issue.

 

Below is a screenshot of the batch macro as it stands.

Screenshot of workflow.PNG

 

Chris
cgoodman3
14 - Magnetar

So with a slight modification, and after a separate conversation with @danfarmerTIL, I've been able to get to a workable solution. My approach was to take the final Download tool out of the original batch macro and put it into its own batch macro, with the addition of the Wait a Second tool. This slowed things down enough to prevent any read/write conflicts.
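Outside Alteryx, the effect of pausing between downloads can be sketched in plain Python (the `fetch` callable and the one-second default delay are illustrative assumptions, not part of the original workflow):

```python
import time

def download_all(urls, fetch, delay=1.0):
    """Call fetch(url) for each URL, pausing between calls so each
    write has time to finish before the next one starts."""
    results = []
    for url in urls:
        results.append(fetch(url))
        time.sleep(delay)  # throttle, like the Wait a Second tool
    return results
```

The throttle doesn't fix duplicate filenames by itself, but it stops two writes from hitting the same file at the same moment.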

 

Looking through the results, it appears that on the website I was scraping there were in fact two companies which appeared twice with the same underlying document, thus causing the read/write conflict.

 

@danfarmerTIL's suggestion was to prefix a Record ID to the filename so that every file is unique.
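The Record ID prefix idea can be sketched in Python (hypothetical filenames; in the workflow itself this would be a Record ID tool feeding a Formula tool):

```python
def make_unique_names(filenames):
    """Prefix each filename with its row number so that duplicate
    document names no longer collide on disk."""
    return [f"{record_id}_{name}"
            for record_id, name in enumerate(filenames, start=1)]

# Two filings share a name, but the prefixed outputs are distinct:
# ['1_filing.pdf', '2_report.pdf', '3_filing.pdf']
print(make_unique_names(["filing.pdf", "report.pdf", "filing.pdf"]))
```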

 

A lesson I've learnt from this is that a batch macro will skip to the next iteration on error and not process the remaining records, so if there is a way of modifying this behaviour it would be great to know.

Chris
Robbobu1
7 - Meteor

@cgoodman3 

 

Did you ever find an answer to this question: "A lesson I've learnt from this is that a batch macro will skip to the next iteration on error and not process the remaining records, so if there is a way of modifying this behaviour it would be great to know"?

 

I have a group of Excel files I am trying to read each quarter, and occasionally I stumble across a corrupt file that Alteryx won't open. Instead of just moving to the next iteration, it stops the whole workflow. I would like the rest of the records to complete and just skip the one corrupt file.

 

Any ideas?

Thanks

Rob

 

Edit: I was able to find an article that suggested running the files through a Dynamic Input tool first. I just grabbed the "List of Sheet Names" option so the schema would be identical. This worked to weed out the corrupt files.
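For anyone doing the same pre-screening outside Alteryx, the skip-and-continue logic might look like this Python sketch (the `loader` callable is a stand-in for whatever actually opens the workbook, e.g. an Excel reader that raises on a corrupt file):

```python
def read_valid_workbooks(paths, loader):
    """Try loader(path) on each file; collect failures instead of
    stopping the whole run on the first corrupt file."""
    loaded, skipped = [], []
    for path in paths:
        try:
            loaded.append((path, loader(path)))
        except Exception:
            skipped.append(path)  # note the bad file and carry on
    return loaded, skipped
```

The `skipped` list then tells you which files to investigate, while every readable file still gets processed.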
