
How to retrieve an auto-incremented column value from SQL after creating data with Alteryx

rnorton
5 - Atom

I am working on a workflow that queries data from one SQL database and maps it to another SQL database. 

 

The destination SQL database tracks uploaded data according to the tool that uploaded it, using a simple handshake as follows:

 

The tool - in this case Alteryx - must create a new record in an uploads table with a userId, a timestamp and a string field containing the filename of the uploading code, e.g. myWorkflowv1_1.yxmd.

 

Once the record is created, the uploading tool then needs to retrieve it, complete with the auto-incremented number generated by the SQL database, and include this number on all records in the main data set to be uploaded. This provides a simple audit trail: if there are any problems with the uploaded data, it can be traced back to the tool that placed it in the database and the time it executed.

 

What is the best way to achieve this in Alteryx? 

 

The flow needs to be sequential: after the workflow creates the initial record in the uploads table, and only once that insert has completed successfully, the new record with the auto-incremented number is retrieved by querying on the exact timestamp, userId and script/workflow filename.
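
To make the handshake concrete, in SQL Server terms it would look something like this (table and column names are illustrative, not the actual schema):

-- Hypothetical uploads table: UploadId is the auto-incremented key
-- CREATE TABLE Uploads (
--     UploadId   INT IDENTITY(1,1) PRIMARY KEY,
--     UserId     INT           NOT NULL,
--     UploadedAt DATETIME2     NOT NULL,
--     ToolName   NVARCHAR(255) NOT NULL
-- );

-- Step 1: register the upload
INSERT INTO Uploads (UserId, UploadedAt, ToolName)
VALUES (42, '2024-01-15 10:30:00', 'myWorkflowv1_1.yxmd');

-- Step 2: read back the auto-incremented UploadId, matching on the
-- exact values written in step 1 (run only after step 1 has committed)
SELECT UploadId
FROM Uploads
WHERE UserId     = 42
  AND UploadedAt = '2024-01-15 10:30:00'
  AND ToolName   = 'myWorkflowv1_1.yxmd';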

 

I see the Output Data tool has a Post SQL statement option; however, the documentation notes this can't be used to retrieve data for use in the workflow.

 

I can use an Input Data tool to create a SELECT query that fetches the new record, but I don't know how to synchronise it so that it only runs after the Output Data tool has successfully created the new record.

 

 

3 REPLIES
JagdeeshN
12 - Quasar

@rnorton ,

 

Have you tried the Crew Macro Pack? It has the Runner and Conditional Runner tools, which would let you implement this sequential execution.

 

Logically, you would call the first workflow to write the values into the config table, and on success a Conditional Runner macro would call the second workflow to read this value back.

 

 

http://www.chaosreignswithin.com/p/macros.html

 

Hope this helps.

 

best,

Jagdeesh

rnorton
5 - Atom

Many thanks JagdeeshN, I'll give that a go.

rnorton
5 - Atom

Hi Jagdeesh,

 

After reading comments about users experiencing scheduling issues with the Crew Macro Pack, and not feeling comfortable creating an external dependency, I managed to solve this using a Block Until Done (BUD) tool in conjunction with a Dynamic Input tool.

 

I understand other parts/inputs of the workflow may not be strictly controlled by BUD, but since they are pretty much guaranteed to take longer to run, I think this will be adequate for now. In any case, if I ever get to the point of trying to upload data without a valid UploadID retrieved from the log table, the main data set's upload SQL will fail - no harm done.
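
For reference, the template query in the Dynamic Input tool looks roughly like this (table and column names are illustrative, not my exact schema). The Replace a Specific String options swap the placeholder values for the real userId, timestamp and filename at run time, and BUD ensures the query only fires after the Output Data tool has written the record:

-- Template query configured in the Dynamic Input tool
SELECT UploadId
FROM Uploads
WHERE UserId     = -1                     -- replaced with the real userId
  AND UploadedAt = '1900-01-01 00:00:00'  -- replaced with the exact timestamp
  AND ToolName   = 'PLACEHOLDER.yxmd';    -- replaced with the workflow filename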

 

I've attached a screenshot of the workflow.

 
