
Chaining multiple workflows and detecting data updates

psyberbrain
6 - Meteoroid

Hi Alteryx community,

 

I would like to know if there is a way / best practice to (a) detect the presence of a new file and (b) chain multiple Alteryx workflows one after the other based on the completion of a previous workflow.

 

My use case is the following:

IT runs an automated DB extract roughly every two hours, creating a CSV file that is to be consumed.

The file is very large, and delivery can fail or be delayed on IT's side.

I would like to process the file as soon as it is delivered, to maintain an up-to-date view of my data.

The first round of processing is very compute intensive and can take over an hour on the Gallery server.

Therefore this step should only run if the DB extract file has actually been updated.

 

After a successful run, further workflows should be triggered, processing that data in different ways.

Further data processing steps should be independent of each other: one failing should only impact its own downstream workflows, not prevent parallel workflows from running.

Further workflows should be chained bottom-up so different teams can build solutions off the previous processing steps independently.

The initial upstream processing workflow should be agnostic of downstream processing, so there is no need to touch the code of the original workflow. (The built-in chaining functionality in Alteryx does not meet this requirement, as it chains the following workflows top-down.)

 

The current solution is to write a log file recording when the initial file was last processed.

A scheduled check then compares the DB output file's last-modified time against that log and processes the file only if it has not been processed before.

This Alteryx check runs on the Gallery every 10 minutes, generates a lot of overhead, and is absolutely not scalable to multiple processing steps without creating a lot of server load.
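For illustration, the log-file check described above can be sketched outside Alteryx as a small script; the file names here (`extract.csv`, `last_run.log`) are placeholders for whatever paths your IT delivery uses:

```python
import os

def needs_processing(extract_path: str, log_path: str) -> bool:
    """Return True if the DB extract is newer than the last processed run.

    The log file stores the extract's mtime as of the last successful run.
    If no log exists yet, the file has never been processed.
    """
    if not os.path.exists(log_path):
        return True
    with open(log_path) as f:
        last_processed = float(f.read().strip() or 0)
    return os.path.getmtime(extract_path) > last_processed

def mark_processed(extract_path: str, log_path: str) -> None:
    """Record the extract's current mtime after a successful run."""
    with open(log_path, "w") as f:
        f.write(str(os.path.getmtime(extract_path)))
```

Run as a lightweight scheduled task (Windows Task Scheduler / cron), this avoids spinning up a full Gallery workflow every 10 minutes just to do the timestamp comparison.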

 

So my questions are: is there a lightweight way to check whether a file has been created/modified on a shared drive and trigger a workflow based on that? If so, that would solve both issues stated above (we would chain the downstream workflows off the output file of the initial workflow).
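One direction to explore: a watcher script that, on detecting a new file, queues the downstream app via the Alteryx Server API rather than running the check as a scheduled workflow. The sketch below only builds the HTTP request; the host is hypothetical, the path follows the general shape of the Server v1 jobs endpoint, and the bearer-token header is a placeholder (older Server API versions sign requests with OAuth 1.0a instead), so check your Server version's API docs before relying on it:

```python
import json
from urllib import request

# Hypothetical Gallery host -- replace with your own.
GALLERY_URL = "https://your-gallery.example.com/gallery/api"

def build_job_request(app_id: str, token: str) -> request.Request:
    """Build (but do not send) a POST that queues a Gallery job for app_id.

    Sending it is a one-liner (urllib.request.urlopen(req)), done here
    only after the watcher sees a fresh extract file.
    """
    url = f"{GALLERY_URL}/user/v1/workflows/{app_id}/jobs/"
    body = json.dumps({"questions": []}).encode()  # no app questions to answer
    req = request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", f"Bearer {token}")  # placeholder auth scheme
    return req
```

The same pattern would let each team register its own trigger off the upstream output file, keeping the upstream workflow untouched.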

If there is no such way, is there a possibility to have a 'listener' on workflow A that signals when A has executed, so that workflows B, C, and D get triggered automatically?

 

Open to any other ideas and suggestions, and thanks in advance for your thoughts.

 

Cheers

 

1 REPLY
apathetichell
18 - Pollux

What's your server/cloud/DB combo? Can you use something like a cloud function/Lambda after your file is written to trigger your Alteryx workflow via API?
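If the extract lands in S3, the event-driven approach suggested above might look like this: a Lambda handler fired by an `ObjectCreated` notification that calls out to the Alteryx API. This is a sketch under that assumption; the `trigger` callable stands in for the actual Gallery API call so the handler stays testable:

```python
def lambda_handler(event, context, trigger=lambda key: None):
    """AWS Lambda entry point for an S3 ObjectCreated event.

    `trigger` is injected so the Alteryx Server API call can be swapped in
    (or mocked); in production it would queue the downstream Gallery job.
    """
    # Each S3 notification record carries the bucket/object that changed.
    keys = [rec["s3"]["object"]["key"] for rec in event["Records"]]
    for key in keys:
        trigger(key)
    return {"processed": keys}
```

Because the trigger fires only when the file is actually written, there is no polling at all, and delayed or failed deliveries on the IT side simply produce no event.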