
Alteryx Designer Desktop Discussions


Execute Tools after Output succeeds

pablo_martin
6 - Meteoroid

Hello everyone,

We're facing a situation at my company, and we're not sure whether what we're asking for is feasible.

CONTEXT

We are thinking about developing a custom execution log for a few workflows that we have scheduled in the Gallery. This would mean writing to a log table in our SQL Server after the workflow's normal output has finished successfully.
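
For reference, the log write itself would be simple. Here is a minimal sketch in Python with pyodbc of the kind of insert we have in mind; the server, database, table, and column names below are placeholders, not our real schema:

# Minimal sketch of the execution-log insert we have in mind.
# Everything named here (server, database, table, columns) is a
# placeholder for illustration.
import datetime
import pyodbc

def write_execution_log(workflow_name: str, status: str) -> None:
    """Insert one row into a hypothetical WorkflowExecutionLog table."""
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=our-sql-server;DATABASE=OurDb;Trusted_Connection=yes;"
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "INSERT INTO dbo.WorkflowExecutionLog "
            "(WorkflowName, Status, FinishedAt) VALUES (?, ?, ?)",
            (workflow_name, status, datetime.datetime.now()),
        )
        conn.commit()
    finally:
        conn.close()

write_execution_log("DailySalesLoad", "SUCCESS")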

 

 

PROBLEM

We don't know how to make a flow continue after an Output tool. I think the picture below presents the problem quite clearly.

[Screenshot: visual_description.PNG, showing the desired flow]

Could someone lend us a hand and show us a way to achieve this flow-control requirement?

 

Best regards,

 

NickC
Alteryx Alumni (Retired)

Hello Pablo,

 

Try using the Block Until Done tool. It will ensure that the first output completes before the second and third run.

 

Nick

bharti_dalal
10 - Fireball

Hi @pablo_martin,

Once you have written data with an Output tool, it is saved on your system. If you want to use that data again, add an Input Data tool and read the file back in. But why do you need to write it with an Output tool every time? Just take the data from the tool preceding the Output tool and continue from there.

pablo_martin
6 - Meteoroid

Hi,

 

Thanks for your answer. We are aware of the Block Until Done tool, but the main data flow and our logging data flow are independent, so it makes no sense to route both of them through the same Block Until Done.

Somehow, we need the flow control that tool provides while keeping two separate data streams that can't be joined together to enter it.

 

Any suggestions?

pablo_martin
6 - Meteoroid

Hi Bharti,

 

It is not the same data that should be written by the two Output tools. The workflows have two independent data streams, and what I would like to achieve is simply to make sure that the second output is only written if the first one succeeds.

 

Best regards,

NickC
Alteryx Alumni (Retired)

In that case, I think you are best off using the CReW macros. The Conditional Runner macro can start a workflow depending on whether another one fails or succeeds.

 

You can download the tools from:

https://community.alteryx.com/t5/Engine-Works-Blog/Crew-Macro-Pack-2016-Q2-Release/ba-p/26482
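
If installing the macro pack is not an option, the same conditional-run idea can also be sketched outside Alteryx. This is only an illustration: it assumes AlteryxEngineCmd.exe (part of Desktop Automation) is available and returns a zero exit code on success, and the workflow paths are placeholders.

# Run the log workflow only if the main workflow succeeds, driven
# from outside Alteryx. Assumes AlteryxEngineCmd.exe returns a zero
# exit code on success; all paths below are placeholders.
import subprocess

ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"

def run_workflow(path: str) -> bool:
    """Run one workflow and report whether it finished without errors."""
    result = subprocess.run([ENGINE, path])
    return result.returncode == 0

if run_workflow(r"C:\workflows\main_output.yxmd"):
    # The main output succeeded, so it is safe to write the log row.
    run_workflow(r"C:\workflows\write_log.yxmd")
else:
    print("Main workflow failed; skipping the log write.")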

 

Thanks,

Nick

pablo_martin
6 - Meteoroid

Hi NickC,

 

I was hoping to achieve this with regular Alteryx tools, since installing custom macros on our company server will probably not be allowed by our policies.

 

Anyway, thanks a lot for the helping hand.

 

Best regards,

cmbarnett
6 - Meteoroid

Did you ever come up with a working solution? I have the same problem. I've broken the steps into separate containers, but I have to enable and disable them manually: in my workflow, one container exports data to an Alteryx database; once that is done, I disable that container and enable the second one, which uses that Alteryx database.

afv2688
16 - Nebula

Hello @pablo_martin,

 

If there is an error during the write, Alteryx (running on the Gallery) will stop the workflow immediately and nothing further will be written.

 

On the other hand, if you want to trigger a failure yourself, you can use the Message tool to throw an error when specific conditions are met.

 

Lastly, if you are talking about running on Desktop, the workflow configuration window has an option on the Runtime tab to cancel the workflow when an error occurs, which, combined with the Message tool, could achieve what you are looking for.

 

[Screenshot: Sin título.png, showing the Runtime settings in the workflow configuration window]

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly.

Regards

ak2018
8 - Asteroid

I know this is an old question. I had to write multiple datasets one after the other. I used the Block Until Done tool, but I also added a flag field with a Formula tool to identify where each row should go; then, before each Output tool, I used a Filter tool to let through only the rows that belong there, and a Data Cleanse tool to drop null rows and columns. If your fields are fixed, you can add a Select tool to ensure no other columns come in.

 

Alternatively, you can use the Join tool on those flag fields and take the unjoined (outer) output for the data set you want to pass through; that way, the rows from the other data stream do not come in. I know this is a bit circuitous, but it worked well for me.
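
For anyone reading along, the flag-and-filter routing described above looks roughly like this when sketched in pandas; it is only an illustration, and the column and value names are made up:

# pandas stands in for the Formula, Filter, and Data Cleanse tools here.
import pandas as pd

df = pd.DataFrame({
    "record_id": [1, 2, 3, 4],
    "amount": [10.0, None, 30.0, 40.0],
})

# Formula tool: flag which output each row belongs to.
df["route"] = ["main", "main", "log", "log"]

# Filter tool: only the matching rows reach each Output tool.
main_rows = df[df["route"] == "main"].drop(columns="route")
log_rows = df[df["route"] == "log"].drop(columns="route")

# Data Cleanse tool: drop rows that are entirely null before writing.
main_rows = main_rows.dropna(how="all")

print(main_rows)
print(log_rows)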
