Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

Optimal Workflow Size

Winston
7 - Meteor

I'm working on a project that is turning into a rather large workflow file.  The problem I'm having is that each time I add a new tool, modify an existing tool, or connect two tools, my system pauses, Designer says it's not responding, and then after 5-10 seconds Designer comes back to life and I can continue.  This cycle repeats any time I try to do anything and is very annoying.

If I'm working in a small workflow, I have no response issues.

 

So I'm wondering: is there a practical limit to the size of a workflow before it becomes more of a pain to work on than having several smaller workflows?

 

As there are still plenty of process steps to add to the workflow, I'm looking for suggestions on how I can make a workflow perform well at design time.

 

Below is what the workflow looks like.

 

Many thanks in advance!

 

 

[Screenshots of the workflow]

 

5 REPLIES
TheOC
15 - Aurora

hi @Winston 

That really is a hefty workflow!

Apart from the obvious (making sure to use batch/iterative macros for repeated tasks, optimising logic, etc.),
I would suggest enabling "Disable Auto Configure" in the advanced settings. This will make the workflow much more responsive when adding tools, as it removes a lot of the lag caused by having many tools on one workflow. Just remember to turn this back off when you've finished working on the workflow and want to run it:

[Screenshot of the Disable Auto Configure option in the advanced settings]



Hope this helps!
TheOC


Bulien
danilang
19 - Altair

Hi @Winston 

 

As well as @TheOC's suggestions, look at taking the repeating parts of your workflow and either:

 

1. add grouping logic within the various tools 

2. move the repeating part to a standard macro and include it multiple times

 

danilang_0-1620477573712.png

 

For example, the image above appears to have 4 identical sections in it.  From the detail in the image it's difficult to tell if they are indeed identical, but if they are, ask yourself why this section is repeated 4 times.  Is it because you're applying the same transformation to 4 datasets that differ only by a key field, e.g. CompanyID?  If so, use grouping operations, i.e. group by CompanyID in the Summarize, Transpose, Multi-Row tools, etc.  Add CompanyID as a key field in Join tools.  Convert any Append Fields tools to Joins with CompanyID as the join field.
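If a rough analogy outside Designer helps, the idea in pandas terms (CompanyID and Amount here are just hypothetical stand-in fields) is to union the streams once and let the group key do the "repeating" instead of copying the same tools four times:

```python
import pandas as pd

# Hypothetical stand-ins for the near-identical input streams,
# which differ only by CompanyID.
df_a = pd.DataFrame({"CompanyID": ["A", "A", "A"], "Amount": [10, 20, 30]})
df_b = pd.DataFrame({"CompanyID": ["B", "B", "B"], "Amount": [5, 15, 25]})

# One combined stream, one set of logic, grouped by the key field.
combined = pd.concat([df_a, df_b], ignore_index=True)
summary = (
    combined.groupby("CompanyID", as_index=False)
            .agg(Total=("Amount", "sum"), Records=("Amount", "size"))
)
print(summary)  # one Summarize-style result per CompanyID, no duplicated branches
```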

 

If you can't add grouping logic, then convert the repeating part to a standard macro and include it 4 times in the main workflow.  This will also make maintenance easier, since you'll only have one place to make any changes.

 

BTW, disabling Auto Configure means that the metadata isn't automatically refreshed when you make changes to tools.  This can cause issues if you add or remove fields, since the new fields don't show up further downstream.  You can press F5 at any time to manually force a metadata refresh to see the effects of any upstream changes.

 

Dan 

TheOC
15 - Aurora

Great shout @danilang!

I couldn't see that much from the images, your eyesight is better than mine!

And I also didn't know you could force a metadata refresh with F5; that's really useful to know. Thanks for sharing that insight!


Bulien
Winston
7 - Meteor

@TheOC thanks for the suggestion, it makes adding in new tools much faster, but as @danilang pointed out, it doesn't populate metadata downstream until you run the flow again.  As the flow takes a while to run, it's quicker to wait for the tools to slowly update than to have to run the flow after adding new tools.  I will have to try using F5 to see how that works time-wise.

 

@danilang You are correct that there are several parts that are repetitive and are coming from different sources for matching.  Because of that, the data from each source is different and thus requires different data manipulation before being put through fuzzy match testing.

 

I am working on a Matching Macro that takes data in with generic field names, so it's just a matter of which fields I assign to the macro inputs.  Unfortunately, this will only replace my fuzzy match tools, so not a lot of savings other than no longer having to figure out why one match tool works but the next one doesn't.
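The rough idea, sketched in Python (the field names and mappings are just placeholders, and difflib is only a stand-in for the Fuzzy Match tool's scoring):

```python
import difflib
import pandas as pd

def fuzzy_match(left, right, left_map, right_map, threshold=0.6):
    """Rename source-specific columns to a generic 'Name' field, then score pairs.
    left_map / right_map are hypothetical {source_column: generic_column} mappings."""
    l = left.rename(columns=left_map)
    r = right.rename(columns=right_map)
    matches = []
    for _, lrow in l.iterrows():
        for _, rrow in r.iterrows():
            score = difflib.SequenceMatcher(
                None, lrow["Name"].lower(), rrow["Name"].lower()
            ).ratio()
            if score >= threshold:
                matches.append(
                    {"left": lrow["Name"], "right": rrow["Name"], "score": round(score, 2)}
                )
    return pd.DataFrame(matches)

# Each source only needs its own field mapping; the matching logic stays identical.
crm = pd.DataFrame({"CustomerName": ["Acme Corp", "Globex LLC"]})
erp = pd.DataFrame({"VendorNm": ["ACME Corporation", "Globex"]})
print(fuzzy_match(crm, erp, {"CustomerName": "Name"}, {"VendorNm": "Name"}))
```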

 

Thanks for the suggestions!

danilang
19 - Altair

@Winston 

 

You can also look at caching the data.  If you're making modifications near the end of your workflow, you might be able to cache the data at an earlier point so only the later tools need to re-run. This might also speed up the metadata refresh, though I'm not sure about that. Check out this article for the conditions where you can and can't cache.
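Outside of Designer, the same principle in a quick Python sketch (the cache file name and the "expensive step" are hypothetical) is just "compute once, reuse until it changes":

```python
from pathlib import Path
import pandas as pd

CACHE = Path("upstream_cache.pkl")  # hypothetical location for the cached result

def run_expensive_upstream_steps():
    # Stand-in for the slow part of the flow (cleansing, joins, fuzzy matching, ...)
    return pd.DataFrame({"CompanyID": ["A", "B"], "Matched": [True, False]})

def get_upstream_data():
    """Reuse the upstream result if it already exists on disk, so edits to the
    downstream part don't pay the full runtime again."""
    if CACHE.exists():
        return pd.read_pickle(CACHE)
    df = run_expensive_upstream_steps()
    df.to_pickle(CACHE)
    return df

print(get_upstream_data())
```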

 

Dan
