Hi Team,
Good Morning,
Based on the subject, I was able to find a couple of links, but unfortunately I am not able to resolve my issue.
Let me first give a brief overview of my workflow connections.
1. My workflow contains database connections through Bitbucket.
2. The workflow uses two different databases, Oracle and Hadoop-Impala, so the final output combines data coming from both.
3. I also used a JSON file (also connected through Bitbucket).
My Impala database refreshes its data on the 7th day of every month, and my workflow is scheduled to run on the 8th day of the month.
The issue is: if the Hadoop-Impala queries fail to run/refresh, how can I stop my workflow from running? (It is scheduled to run automatically.)
Ideally, if any error occurs, the workflow should stop immediately via some flag, without inserting a single record into the output source. But since there are multiple columns coming from both data sources, I cannot work out what condition or flag would let me stop the workflow immediately.
I tried the "Block Until Done", "Count", "Test", and "Message" tools, but I could not figure out the specific flag or condition.
I hope I have explained my issue properly.
Please let me know if additional information is required.
This is an urgent request; I need to deliver the data ASAP.
Please advise.
Regards
Piyush Jain
If you have an enterprise scheduling tool like Control M, you could use that to put a job in the Alteryx Server queue with the Alteryx Server APIs. What you are really looking for is a mechanism of determining whether or not a database load succeeded.
An alternative would be to have a workflow scheduled in Alteryx Server which serves as a check to see if the load succeeded with an API call at the end that kicks off the secondary workflow. This approach would be beneficial if you did not want to use anything external to check the success of the database load. That way your main workflow actually wouldn't be scheduled at all, but rather would be dependent on another workflow that checks the status of the load.
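For reference, here is a rough sketch of what the trigger step of such a checker workflow amounts to if scripted outside Alteryx. This is Python, and the Gallery URL, app ID, endpoint path, and payload shape are all placeholders/assumptions — the Server API also requires OAuth 1.0a request signing, which is not shown here; consult the Alteryx Server API documentation for the exact signing steps:

```python
import urllib.request

# Placeholder values -- substitute your own Gallery URL and the
# secondary workflow's app ID from your Server.
GALLERY_API = "https://your-gallery/api/user/v1"
APP_ID = "secondary-workflow-app-id"

def build_queue_url(gallery_api: str, app_id: str) -> str:
    """Build the job-queue endpoint URL for a given workflow app ID.
    (Assumed path shape; OAuth 1.0a signature parameters still need
    to be appended before the request is sent.)"""
    return f"{gallery_api}/workflows/{app_id}/jobs/"

def queue_job(signed_url: str) -> None:
    """POST an (assumed) empty question payload to queue the job."""
    req = urllib.request.Request(
        signed_url,
        data=b'{"questions": []}',
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

The point is only the shape of the mechanism: the checker workflow runs on the schedule, and the secondary workflow runs solely when the checker decides to queue it.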
Thanks BrandonB,
I believe the 2nd approach is more feasible, and I am trying to build it.
Since this is an urgent deliverable, could you please share a sample workflow showing how to check whether the load succeeded, with an API call at the end that kicks off the secondary workflow?
Thanks again,
Regards
Piyush
Checking whether or not a load succeeded is entirely dependent on what internal mechanisms you have to check. I'm not sure if you have a timestamp in your tables or something similar, but that is something you will need to figure out from the systems that you have.
As for kicking off another workflow from the first workflow, please take a look at this for reference: https://community.alteryx.com/t5/Engine-Works/Using-the-Alteryx-API-from-Alteryx/ba-p/318565
The logic would look like this:
Check a timestamp (or something similar) within the table against DateTimeToday() with a Filter tool. If the dates match and data flows through the True output, the API call (built using the method above) kicks off the second workflow you already have. If the data goes through the False output, the end of the process won't execute, and therefore the secondary workflow will not run.
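Sketching that filter logic outside Alteryx may make it concrete. This is a hedged Python analogue, not the workflow itself; `load_date` stands in for whatever timestamp column your table actually exposes:

```python
from datetime import date

def check_and_split(rows, today=None):
    """Mimic the Filter tool: rows whose load_date equals today go to
    the True branch, everything else to the False branch. Only the
    True branch would go on to make the API call."""
    today = today or date.today()
    true_branch = [r for r in rows if r["load_date"] == today]
    false_branch = [r for r in rows if r["load_date"] != today]
    return true_branch, false_branch

def should_trigger(true_branch) -> bool:
    """Trigger the secondary workflow only when the True branch
    actually received data (i.e. the load happened today)."""
    return len(true_branch) > 0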
Hi @PiyushJain
At the very minimum you should have "Cancel Running Workflow on Error" checked in your workflow settings.
This will halt your workflow on any error.
This by itself isn't enough though. Because of the order in which the Alteryx Engine processes tools, it's possible for one branch of a workflow to make it all the way to the end and write an output before the next branch has a chance to error. You can use the following technique to make sure that all branches have completed before writing any output.
Alteryx has the concept of blocking vs. non-blocking tools. Non-blocking tools pass records through as they become available, so downstream tools can proceed with partial data. The Formula tool is a good example: as soon as records arrive on its input, it performs the calculations on them and passes them through to the next tool, with some batching to improve throughput. The next batch of records is processed and passed along as it arrives, until all the records are processed. Blocking tools stop records from passing through until all records are available at the input. Most of the Join tools, the Summarize and Count tools, and others are blocking tools: they effectively stop your workflow from proceeding until all records have been processed. You can use this fact to implement error handling and output synchronization.
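The distinction is the same one you see between streaming and materializing operations in ordinary code. A rough Python analogy of the two behaviors (this illustrates the concept only, not how the Alteryx Engine is implemented):

```python
def non_blocking(records):
    """Formula-tool style: transform each record as it arrives and
    yield it straight through -- downstream work can start before
    the input is exhausted."""
    for r in records:
        yield r * 2

def blocking(records):
    """Summarize/Count style: nothing is emitted until every input
    record has been consumed."""
    buffered = list(records)   # must read the whole input first
    return [sum(buffered)]     # only then does output exist

stream = non_blocking(iter([1, 2, 3]))
first = next(stream)           # available immediately
total = blocking([1, 2, 3])    # available only after all records read
```

A blocking tool placed before an output therefore acts as a natural synchronization point, which is exactly what the techniques below exploit.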
To stop the workflow if there are no records read from an input, use this technique
The Message tool throws an error if the output of the preceding Count tool is 0. The Append Fields tool blocks the records from flowing through it until the Message tool has had a chance to throw the error, if required. The last Select tool removes the Count field from the data. Use this on all your inputs.
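In plain-code terms, the Count + Message + Append Fields pattern amounts to the following guard (a Python sketch of the logic, not of the tools themselves; the input name is illustrative):

```python
def guard_nonempty(records, input_name="Impala input"):
    """Count the records; raise an error (like the Message tool) when
    the count is 0, otherwise pass the records through unchanged
    (like the Append Fields + Select pair, which also drops the
    temporary Count field again)."""
    records = list(records)        # Count tool: consume and count
    if len(records) == 0:          # Count output is 0 -> error out
        raise RuntimeError(
            f"{input_name}: no records read -- stopping workflow")
    return records                 # non-empty: flow continues
```

Combined with "Cancel Running Workflow on Error", an empty input halts the run before any downstream output is written.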
To ensure that no outputs are written until all of them are available and all Message tools have had a chance to throw errors, use this technique.
The Join tool is also a blocking tool. It's configured to Join by Record Position so that you only have one record as output. When coupled with the following Append Fields tools on each of the output branches, it ensures that none of the outputs can be written until the workflow has completely finished all the branches. If you have more than two outputs replace the Join with a Join Multiple and add an Append Fields tool on each of the output branches. Use this immediately before all your outputs.
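The Join-by-Record-Position trick is, in effect, a barrier: no output file can be written until every branch has fully finished, and an error on any branch prevents all writes. A Python sketch of that barrier idea (branch names and the writer callback are illustrative):

```python
def run_with_barrier(branches, write_output):
    """Fully evaluate every branch first (the 'join' step); only then
    write any outputs. If any branch raises, nothing is written."""
    # Barrier: all branches must complete before the loop below runs.
    results = {name: list(fn()) for name, fn in branches.items()}
    for name, rows in results.items():
        write_output(name, rows)
    return results
```

Usage mirrors the workflow: one branch per data source, with the writes held back until both have produced their records.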
You can find out which tools are blocking and many other details about the various tools by looking at the Periodic Table of Alteryx Tools.
Dan