I have a workflow that I've split into two tool containers. The first is an input to output that takes data from one source and loads it into another source (to update an existing table).
The second tool container then takes the data that was just loaded and executes some data prep. So the second part relies on the first part being run first as this process updates the data.
However, when I run the workflow, it looks like it runs the second part first so it doesn't take the updated data. Is there a way to control the order of execution such that I can run the first part first?
Alteryx always tries to process data in parallel whenever possible to take advantage of all the processing power available on CPUs with multiple cores. Because of this, splitting a data stream in two does not guarantee that either stream will complete before the other. There are a few ways you can enforce the order of execution.
1. Ensure that the output of your first container is used as the input of your second. In your case, instead of writing out to a data file (source) in your first container and reading it back in the second, just pipe the data directly as an input to the second.
2. Use a Block Until Done tool with container 1 on output 1 and container 2 on output 2.
3. Use a chained app with container 1 in the first app and container 2 in the second. This ensures that container 1 is completely finished before container 2 starts.
I am going through some difficulties because of my lack of understanding of sequence of Alteryx tools execution. Is there a documentation which explains how Alteryx decides the sequence of tool execution?
@akasubi Below is the approach that you can take to control the execution sequence:
Create two separate workflows to logically divide the operation you want to perform. The first workflow is the driving workflow, whose successful completion triggers the other. In your case, the data-loading process is the driving workflow and the data-preparation part is the dependent workflow.
Next, add a Run Command event that executes a batch script to run the second workflow after the first one finishes. To do this:
1) Go to the Workflow Configuration window.
2) Select the Events tab.
3) Under the Add button's drop-down menu, select the Run Command option.
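The batch script that the Run Command event invokes could look something like the following minimal sketch. It uses AlteryxEngineCmd.exe, the command-line runner that ships with Alteryx; the install path and workflow filename below are placeholders, not values from the thread, so adjust them to your environment:

```shell
@echo off
REM Run the dependent data-preparation workflow after the driving workflow completes.
REM Both paths are illustrative placeholders -- point them at your own install and workflow.
"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe" "C:\Workflows\DataPrep.yxmd"
```

Save the script as a .bat file, then reference it in the Run Command event of the driving (data-loading) workflow so that the data-preparation workflow only starts after the load has succeeded.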