Hi Team,
Good Morning,
Searching on this subject, I was able to find a couple of related links, but unfortunately I am not able to resolve my issue.
Let me first give a brief overview of my workflow setup.
1. My workflow connects to its databases through Bitbucket.
2. The workflow uses two different databases, Oracle and Hadoop/Impala, so the final output combines data coming from both.
3. I also use a JSON file (also connected through Bitbucket).
My Impala database refreshes its data on the 7th day of every month, and my workflow is scheduled to run on the 8th day of every month.
Now the issue: if the Hadoop/Impala queries fail to run or refresh, how can I stop my workflow from running (since it is scheduled to run automatically)?
Ideally, if any error occurs, the workflow should stop immediately via some flag, without writing a single record to the output destination.
But since multiple columns come from both data sources, I cannot work out what condition or flag would let me stop the workflow immediately.
I tried the Block Until Done, Count, Test, and Message tools, but I could not figure out the specific flag or condition to use.
- Ideally, I want my workflow to start running only once all the specific conditions/flags are met; otherwise it should fail immediately.
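To illustrate the kind of pre-check I have in mind, here is a rough sketch in plain Python (this is only pseudologic, not my actual workflow; the function name, row counts, and refresh date are placeholders I would feed in from the two sources):

```python
from datetime import date

def pre_run_check(impala_row_count, oracle_row_count, impala_refresh_date, today=None):
    """Return True only if both sources look ready; otherwise raise to abort
    the run before a single record is written to the output."""
    today = today or date.today()
    if impala_row_count == 0:
        raise RuntimeError("Impala returned no rows - aborting before writing output")
    if oracle_row_count == 0:
        raise RuntimeError("Oracle returned no rows - aborting before writing output")
    # Impala refreshes on the 7th, so a run on the 8th should see data
    # refreshed in the current month.
    if (impala_refresh_date.year, impala_refresh_date.month) != (today.year, today.month):
        raise RuntimeError("Impala data not refreshed this month - aborting")
    return True
```

In other words: check the row counts and the refresh status first, and only if every check passes should the rest of the workflow run and write output.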
I hope I have explained my issue properly.
Please let me know if any additional information is required.
This is a fairly urgent request, as I need to deliver the data ASAP.
Please advise.
Regards,
Piyush Jain