
Alteryx Designer Desktop Ideas


Control Order of Execution of Workflow Objects (Container)

Hi,

I want to control the order of execution of objects in an Alteryx workflow, but right now we have ONLY Block Until Done, which is not the right choice in many cases.

Could we have a container (say, a Sequence Container), put a piece of logic in each container, and control the flow by connecting the containers? That way we could control the execution order.

It might look something like the mock-up below.


63 Comments
BenG
Alteryx Alumni (Retired)

We do get many requests for this kind of functionality.  Could you describe a couple of scenarios where this would be helpful for you?

Thanks,

Ben

s_pichaipillai
12 - Quasar
Ben,

Thanks for responding!

Let's take the following scenario: I need to build a data mart for my customer, and it keeps history in the data warehouse. The data comes from different systems, such as files and databases, so I need to bring it all into a staging area (tables), then load my dimensions, then my fact tables, and finally pull the necessary data set into my Tableau dashboards.

Think about that data flow: if we had control over it, I could run the steps sequentially, so the dimension and fact loads never execute before the data lands in staging. I would run them in the order below.

1. All my source files are loaded into staging first.
2. Once staging is done, I can load all my dimensions.
3. After all dimensions are loaded, I can load my fact table, since the dimension and fact tables have primary/foreign key relationships.
4. After my data mart has loaded successfully, I can grab the data set for the Tableau dashboards and refresh them.

Hope this helps to understand the idea better :)
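Purely for illustration (the table and column names below are made up), this is the strict ordering I am after - in a single script, each statement runs only after the previous one has finished:

library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, dbname = "your db name here",
                 host = "your host here", port = 5432,
                 user = "your name here", password = "your password here")

## 1. staging loads run first
dbSendQuery(con, "INSERT INTO stg_sales SELECT * FROM src_sales;")

## 2. dimensions load only after staging has finished
dbSendQuery(con, "INSERT INTO dim_customer SELECT DISTINCT customer_id, customer_name FROM stg_sales;")

## 3. facts load last, so the dimension rows they reference already exist
dbSendQuery(con, "INSERT INTO fact_sales SELECT customer_id, sale_amount FROM stg_sales;")

dbDisconnect(con)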
 
 
anthony
11 - Bolide
I also have sequential needs when I make API calls - I need to get some data, process it, and then use that data for the next call.

Since I don't know the sequence in which Alteryx runs things, I have been hacking my way around with Block Until Done and hoping it works.

Being able to build our workflows and have them execute exactly in the order we expect would be a nice 'feature'.
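Purely as an illustration (the endpoint and field names below are made up), here is the kind of dependency I mean - the second call cannot fire until the first one has returned, which a script guarantees but the workflow canvas today does not:

library(httr)
library(jsonlite)

## step 1: fetch a token from a (hypothetical) auth endpoint
auth_resp <- GET("https://api.example.com/auth")
token <- fromJSON(content(auth_resp, as = "text"))$token

## step 2: needs the token, so it must wait for step 1 to finish
orders_resp <- GET("https://api.example.com/orders",
                   add_headers(Authorization = paste("Bearer", token)))
orders <- fromJSON(content(orders_resp, as = "text"))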

Thanks
Anthony
rob_cottrell
7 - Meteor
Here are some other examples:
  • A process outputs several files, then uses RunCommand to ZIP them together.  All of the outputs need to be completed successfully before the RunCommand tool executes.
  • A workflow outputs a script file (.txt).  Then it outputs a batch file (.bat) that runs a 3rd-party SFTP program, which reads the script file.  Then it uses a RunCommand tool to execute the batch file.  The script and batch file must be completely written before the RunCommand tool executes.
Both of these are real-world examples I use today.

 
BenG
Alteryx Alumni (Retired)
Thanks for the additional feedback.  This helps with planning and discussing the specific details of what we need for something like this.

Ben
DanH
Moderator

There is currently an additional option beyond Block Until Done, but it's not obvious.

 

Batch macros guarantee that the contents of an iteration are complete before reporting a complete status back to the parent workflow. So any process whose completeness you need to guarantee (including processes that contain only terminated streams, such as Output or Render tools) can be placed in a batch macro, and the outgoing macro stream will be 100% complete, even without a Block Until Done after the batch macro.

 

To set up this batch macro, consider that:

  • You will likely need only one iteration of the batch macro, in which case you can simply append a temporary ID of "1" to the incoming data stream to use as your Control Parameter.
  • If your process has no output stream (as with the Output Data tool), your macro still needs one. In that case, artificially provide an output with a Text Input connected to a Macro Output.

In my experience, this is the most bullet-proof method to explicitly control the order of operations for complex workflows.

s_pichaipillai
12 - Quasar

Dan,

Thanks! Glad to see that there is a workaround already.

Could you please post a sample if you have one? I would love to use it :-)

BenG
Alteryx Alumni (Retired)
Status changed to: Under Review
 
pmo511
5 - Atom

Use case:

 

Sequence 1. Load dimension tables.

Sequence 2. Load fact tables (these need to run after the dimension tables so surrogate keys can be looked up and inserted into the facts).

Matt_L
5 - Atom

Another workaround that I use is the R tool. If you're using a Postgres database, this library lets you connect: https://cran.r-project.org/web/packages/RPostgreSQL/index.html. You have full control over the order in which the data is extracted through the R tool, based on how you set up the code.

 

require("RPostgreSQL")

pw <- {
  "your password here"
}

# loads the PostgreSQL driver
drv <- dbDriver("PostgreSQL")
# creates a connection to the postgres database
# note that "con" will be used later in each connection to the database
con <- dbConnect(drv, dbname = "your db name here",
                 host = "your host here", port = 5432, ## 5432 is the Postgres default
                 user = "your name here", password = pw)

data <- read.Alteryx("#1",mode="data.frame") ## Read data into R from connection 1 in the Alteryx workflow

dbSendQuery(con, "SET search_path TO your_schema_here;") ## optional; may not be needed

dbWriteTable(con, "alteryx_table", data, overwrite = TRUE) ## write a table to your db from Alteryx 

df_postgres <- dbGetQuery(con, "SELECT * FROM your_table_name;") ## pull a table from your db to Alteryx

df_postgres[] <- lapply(df_postgres, as.character) ## got errors in the next line without this

write.Alteryx(df_postgres, 1) ## pass table into alteryx workflow
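
dbDisconnect(con) ## close the database connection when you're finished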