Alteryx Designer Ideas


basic flow-of-control macro / container type

Currently, in order to run a series of Alteryx processes that have start-to-finish dependencies, you have to hack this by putting each sub-process into a batch macro with fake inbound and outbound controls and a fake control parameter. Additionally, Alteryx forgets the mapping of parameters if you move things around, and you have to re-link it all and run each step one by one until the next one fails.


Some may say "just use Block Until Done":

- this only partially solves the issue, and only if your dependency chain is one level deep (e.g. create the table, then send a summary of errors found in the table)

- this doesn't create any ability to encapsulate flows for simplicity, so it drives the user towards ever larger and more complex canvases


If we could create a new macro type or container type that simply allows start-to-finish dependency chaining like a sub-procedure, and passes the input stream directly through to the output stream, this would materially improve the ability to simplify flows. At the same time, it's a cheap and easy way to allow for detailed dependency and flow control.


Happy to talk through this live with the team if that makes sense.


Thank you



5 - Atom

Hi Sean,


I am working on a set of workflows that are sequential in nature, and the requirement is to schedule them.

I tried Block Until Done and the Run Command tool but couldn't achieve the desired results. It would be really helpful if you could elaborate a bit more on this solution.

Thanks a lot, and looking forward to your help.


16 - Nebula

Hey @vn96966,


Sorry for the confusion; this thread is about the idea that Alteryx needs to support this capability (because it's tough right now).

From historical discussions, there seem to be three workarounds, but to help you get to a solution for your specific problem, it's worth posting it in the discussion thread under Data Prep & Blending.


Three workarounds I've heard of or used:

a) Download the CREW macros; these contain a tool called "MacroRunner" (I think) that allows you to force a sequence

b) You can wrap each piece of your flow in a batch macro; that will allow you to control the flow

c) Finally, if your Alteryx license allows you to call Alteryx from the command line, you can use a Run Command tool to run another copy of Alteryx with your sub-flow
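As a rough illustration of option c), a small Python wrapper can chain workflow runs so each one finishes before the next starts. This is a sketch under assumptions: it assumes the engine CLI lives at the usual `AlteryxEngineCmd.exe` install path (adjust for your machine), and the workflow file names are made up.

```python
import subprocess
from pathlib import Path

# Path to the Alteryx engine CLI; adjust to your install location (assumption).
ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"

def build_command(workflow: str) -> list:
    """Build the command line to run one workflow."""
    return [ENGINE, str(Path(workflow))]

def run_in_sequence(workflows) -> None:
    """Run each workflow to completion, stopping at the first failure."""
    for wf in workflows:
        result = subprocess.run(build_command(wf))
        if result.returncode != 0:
            raise RuntimeError(f"{wf} failed (exit code {result.returncode})")

# Hypothetical usage:
# run_in_sequence(["step1_build_table.yxmd", "step2_error_summary.yxmd"])
```

Because the wrapper checks each exit code before launching the next run, a failed step halts the chain instead of kicking off a dependent flow against incomplete data.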


But what I'd say is that it's worth posting a question in the discussion area, where the community can help with your particular challenge.






6 - Meteoroid

The CREW macros don't wait for one flow to execute and complete before moving on to the next. The CREW macros also cause issues on the Server; i.e. the CREW macros are not the solution.


I see one of two options (or both):


a. Add two options to each tool:

  1. A number field that indicates the order of flow processing. The user would be able to edit this number on each tool. Where multiple flows have the same number, Alteryx would process whichever can be processed first, or process them at the same time (as happens now).
  2. A tick box: "Continue with the next tool only when this tool's output is completely done". With this turned on, Alteryx would not continue to the next tool until the tool in question has done its job 100%.

b. Build an official, real "Wait Until Done" tool, i.e. when the flow hits this tool, it stops any processing of downstream tools until all the previous tools are completed.


I must note that option a. is the better option, but also the more complex to build, I think.


FYI: I built a Python script to help me solve the problem of dependent flows on the Server starting to process before the previous flows are completed, but the Server can't deal with it.


It looks like I have to change my processes to accommodate Alteryx...

11 - Bolide

This can be handled another way (a workaround), though I fully support the idea(s) here. The other way is to break the flow into several flows and have the predecessors write a trigger when done, which the dependent flow or flows look for. I have built some large flows with several processing paths, each starting with its own input; this method brought efficiency and ease to the situation.


Scenario: two inputs read in data and do some processing, and are then joined for final processing and reporting. The two input paths are parallel processes, but due to Alteryx's lack of multi-path processing they run serially, about 2 hours per path; once both finish, the final leg runs and takes another 2 hours, so 6 hours total.


This example covers both ensuring one set of logic completes before another starts and enabling parallel processing.


By breaking the flow into three flows (one for each input path, plus one for final processing) and running the first two at the same time, the end-to-end run time is now 4 hours, saving 2 hours. But we also don't start flow 3 until flows 1 and 2 are done. To automate this and other things, I built my own scheduler; see my Poorman's Scheduler post.


But to keep it simple, you can build one more flow that reads the trigger file written by each of the first two flows. If the trigger, whatever you decide to make it, is there, indicating the other two flows completed successfully, then it can use one of several options to kick off the final flow. Options include a CREW macro to run another macro, or, if you have the Scheduler add-on for Designer or a Server license, the Run Command tool, or...


My scheduler keeps a database/list of flows, their required run conditions, and their last status (inactive, running, completed good, or failed). It reads the log files for each flow run since the last time the scheduler flow itself ran (I had it set to run every 10 minutes). From the log files it determines which flows completed and whether they succeeded or failed; it then evaluates the conditions for each flow in its list, including predecessor/successor relationships with other flows, and if all conditions are met it builds a batch file to run the selected flow.
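The predecessor check at the heart of a scheduler like that can be reduced to one small function. This is a simplified sketch, not the poster's implementation: it ignores log parsing and just answers "given which flows have completed successfully, which flows may run now?" The flow names are hypothetical.

```python
def runnable_flows(flows, completed_ok):
    """Return flows whose predecessors have all completed successfully.

    flows:        dict mapping flow name -> list of predecessor flow names
    completed_ok: set of flow names whose last run succeeded
    """
    return [
        name
        for name, predecessors in flows.items()
        if name not in completed_ok
        and all(p in completed_ok for p in predecessors)
    ]

# Example: flow3 depends on flow1 and flow2 (as in the scenario above).
flows = {"flow1": [], "flow2": [], "flow3": ["flow1", "flow2"]}
# Nothing done yet: only the two independent input flows are runnable.
# runnable_flows(flows, set())               -> ["flow1", "flow2"]
# Both inputs succeeded: the final leg becomes runnable.
# runnable_flows(flows, {"flow1", "flow2"})  -> ["flow3"]
```

Run on a timer (every 10 minutes, say), this check naturally produces the cascade described above: each pass launches whatever has just become eligible.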


Running a flow can be done with the Run Command tool, with a batch file, or with a CREW macro; with either of the first two you can also force it to run in the Alteryx scheduler if you have a Server or Scheduler add-on license.



16 - Nebula

@fharper, thank you for taking the time to write this up. Essentially, you've found a way to do a flow-of-control process using external semaphore-type signals.


@AdamR, @SteveA, @TanyaS, @JPoz: wanted to include you on this thread so that you can see some of the creative ways that people work around the need to sequence flows that require end-to-start dependence control.

5 - Atom

Another vote for a simple way to sequence separate workflows on Server. I am trying to update what are essentially two dimensions, and then the measure. Everything needs to run separately because the dimensions might be used for other measures. I could see this growing into something several layers deep, such that kicking off one workflow initiates a cascading sequence of workflows.

16 - Nebula

If I had a dollar for every time I built a batch macro to handle the flow of events...