Alteryx Designer Desktop Discussions

Execute Tools after Output succeeds

pablo_martin
Meteoroid

Hello everyone,

 

We're facing a situation at my company, and we're not sure whether what we're asking for is feasible.

 

CONTEXT

 

We are thinking about building a custom execution log for a few workflows that we have scheduled in the Gallery. This would mean writing to a log table in our SQL Server after the workflow's normal output has finished successfully.
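
To make the idea concrete, the kind of record we would append looks roughly like the sketch below; the table and column names are only illustrative, not our real schema.

    -- Illustrative log table on our SQL Server (all names are made up)
    CREATE TABLE dbo.WorkflowExecutionLog (
        WorkflowName  VARCHAR(200) NOT NULL,
        FinishedAtUtc DATETIME2    NOT NULL,
        RowsWritten   INT          NULL,
        Status        VARCHAR(20)  NOT NULL
    );

    -- Row the post-output step would insert once the main Output tool succeeds
    INSERT INTO dbo.WorkflowExecutionLog (WorkflowName, FinishedAtUtc, RowsWritten, Status)
    VALUES ('MyScheduledWorkflow', SYSUTCDATETIME(), 1000, 'SUCCESS');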
PROBLEM

We don't know how to make a workflow continue after an Output tool. I think the picture below presents the problem fairly clearly.

[Screenshot: visual_description.PNG]
Could someone give us a helping hand and show us a way to achieve this flow control requirement?

 

Best regards,

 

14 replies
Alex98970
Atom

Hey guys, was this resolved? I am trying to implement a similar kind of thing:

Once data is loaded into my table, I want to run a stored proc to populate other tables.

 

I am using Snowflake as the database
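
For illustration, the post-load step I have in mind is just a stored procedure call like the one below; the procedure name is a placeholder, not a real object in my database.

    -- Snowflake: run the procedure that populates the other tables
    -- (populate_downstream_tables is a made-up name)
    CALL populate_downstream_tables();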

 

Mchappell67
Comet

Hi -

 

This solution came to my attention recently:

 

  1. Turn each workflow into an analytic app, then follow the directions in the linked interactive lesson below for Chaining Analytic Apps: https://community.alteryx.com/t5/Interactive-Lessons/Chaining-Analytic-Apps/ta-p/243120
  2. To turn a workflow into an app, click on the Workflow tab of the configuration window, click the radio button for Analytic App, and then save the workflow.

Perhaps that would work for you. 

Mark
Alex98970
Atom

Thanks, Mark. What do you think of the CReW macros?

[Screenshot: Alex98970_0-1675038148939.png]
Can this be used?

Regards
Mchappell67
Comet

Hi - 

 

Unfortunately, I have not seen the CReW macros in action yet, so I'm not really in a position to comment. The solution I linked to above seems to be a simple way of linking workflows, but it may not be powerful enough for you.

 

Mark

blyons
Bolide

I had a similar problem, because the database had an auto-incrementing ID which I needed for subsequent steps. Here is my solution, which does not use any Chaos Reigns macros:

[Screenshot: blyons_4-1678901846455.png]

 

The incoming stream has an "insert_timestamp" field. The Summarize tool blocks downstream tools just like Block Until Done does, and it gets the earliest timestamp so the Dynamic Input tool can modify its query to fetch just the records that were appended to the table by the previous Output tool.
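
In SQL terms, the query the Dynamic Input tool ends up running has roughly the shape below; the table name is a placeholder, and the literal is replaced at run time with the earliest insert_timestamp produced by the Summarize tool.

    -- Rough shape of the modified Dynamic Input query (target_table is a made-up name)
    SELECT *
    FROM target_table
    WHERE insert_timestamp >= '1900-01-01 00:00:00';  -- replaced with the earliest timestamp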

[Screenshot: blyons_2-1678901148590.png]

 

It is also worth noting that, just like the Summarize tool, the Block Until Done tool does not wait for other upstream branches to complete; it only blocks until its own incoming stream is complete. So, if the Output tool is still busy writing to the table, which can happen when a large volume of records is being written to a cloud database, the Dynamic Input tool won't see any data yet. To allow for that, I added a delay in the Pre SQL Statement of the Dynamic Input tool:

[Screenshot: blyons_3-1678901494919.png]

(This command is specific to Snowflake. Other databases will differ.)
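
For anyone curious, one way to express that kind of fixed delay in Snowflake is the SYSTEM$WAIT function; the 60-second value below is just an example.

    -- Snowflake Pre SQL sketch: pause before the Dynamic Input query runs
    SELECT SYSTEM$WAIT(60, 'SECONDS');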

 

In the OP's use case, since they don't want any of the fields from the previous stream and just want to delay the secondary stream, you can use the Append Fields tool to append the output of the Summarize tool, and then simply deselect the appended field.

 

I hope that helps others that may have this question.
