Hello everyone,
We're facing a situation at my company, and we're not sure whether what we're asking for is feasible.
CONTEXT
We are thinking about developing a custom execution log for a few workflows that we have scheduled in the Gallery. This would mean writing to a log table in our SQL Server after the workflow's normal output has finished successfully.
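For reference, the kind of logging statement we have in mind is roughly the following; the table and column names are only placeholders, not our actual schema:

    INSERT INTO dbo.WorkflowExecutionLog (WorkflowName, RunFinishedAt, RowsWritten, Status)
    VALUES ('MyScheduledWorkflow', GETDATE(), 12345, 'Success');  -- hypothetical log table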
PROBLEM
We don't know how to make a flow continue after an Output tool. I think the following picture presents the problem quite clearly.
Could someone lend us a hand and show us a way to achieve this flow-control requirement?
Best regards,
Hey guys, was this resolved? I am trying to implement a similar kind of thing:
Once data is loaded into my table, I want to run a stored procedure to populate other tables.
I am using Snowflake as the database.
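Concretely, once the load has finished I want Snowflake to run something like this (the procedure name is just a placeholder):

    CALL populate_downstream_tables();  -- hypothetical stored procedure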
Hi -
This solution came to my attention recently:
Perhaps that would work for you.
Mark
Thanks, Mark. What do you think of the Crew macros?
Can they be used here?
Regards
Hi -
Unfortunately, I have not seen the Crew macros in action yet, so I'm not really in a position to comment. The solution I linked to above seems to be a simple way of linking workflows, but it may not be powerful enough for your case.
Mark
I had a similar problem because the database had an auto-incrementing ID that I needed for subsequent steps. Here is my solution, which does not use any of the Chaos Reigns (Crew) macros:
The incoming stream has an "insert_timestamp" field. The Summary tool blocks downstream tools just like the Block Until Done tool does, and it gets the earliest timestamp so that the Dynamic Input tool can modify its query to fetch just the records that were appended to the table by the previous Output tool.
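For illustration, the query the Dynamic Input tool ends up issuing looks something like the sketch below; the table name and the literal timestamp are placeholders, with the timestamp taken from the Summary tool's earliest insert_timestamp:

    SELECT *
    FROM my_schema.my_table                              -- placeholder table name
    WHERE insert_timestamp >= '2020-01-01 12:34:56.000'  -- earliest timestamp from the Summary tool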
It is also worth noting that, just like the Summary tool, the Block Until Done tool does not wait for branches prior to it to complete; it only blocks until the incoming stream to that tool is complete. So, if the Output tool is still busy writing to the table, which can happen when a large number of records is being written to a cloud database, the Dynamic Input tool won't see any data yet. To allow for that, I added a delay in the Pre SQL Statement of the Dynamic Input tool:
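A minimal sketch of that delay, assuming a fixed 10-second wait, is:

    SELECT SYSTEM$WAIT(10, 'SECONDS');  -- pause before the Dynamic Input query runs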
(This command is specific to Snowflake. Other databases will differ.)
In the OP's use case, since they don't want any of the fields from the previous stream and just want to delay the secondary stream, you can use the Append Fields tool to append the output of the Summary tool and then simply deselect the appended field.
I hope that helps others that may have this question.