Hi @Morten ,
I'm not aware of a direct solution for that.
What I've done in the past is create, via the pre- and post-SQL statements, a table with two columns: start (filled by the pre-SQL) and finish (filled by the post-SQL). That way, if finish is null, I know the table is currently being updated.
You then have the option to add a counter that waits, or to simply stop the process.
It is not a pretty solution, but it does work.
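A minimal sketch of this status-table idea, using SQLite for illustration (the table and column names, and the use of Python here, are my own assumptions — in Alteryx the two INSERT/UPDATE statements would go in the pre- and post-SQL boxes):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Status table maintained by the pre- and post-SQL statements
# (names are illustrative).
cur.execute("CREATE TABLE load_status (start TEXT, finish TEXT)")

# Pre-SQL: record the start of the load; finish stays NULL while it runs.
cur.execute("INSERT INTO load_status (start, finish) VALUES (datetime('now'), NULL)")

# Any other process can check whether a load is still in flight.
during = cur.execute(
    "SELECT COUNT(*) FROM load_status WHERE finish IS NULL"
).fetchone()[0]

# Post-SQL: stamp the finish time once the load completes.
cur.execute("UPDATE load_status SET finish = datetime('now') WHERE finish IS NULL")
after = cur.execute(
    "SELECT COUNT(*) FROM load_status WHERE finish IS NULL"
).fetchone()[0]

print(during, after)  # 1 0
```

A reader that finds a row with a NULL finish can then wait (the counter option) or abort (the stop option).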
Best,
Fernando Vizcaino
Hi Fernando Vizcaino,
Thank you for sharing your solution. I actually worked with the same idea, but went with something else, as it requires too much configuration every time you need to input or output data.
Instead I use the in-db tools:
First I use a Data Stream In tool to create a temp table.
Then I bulk upload to the table using Write Data In-DB.
This prevents dirty reads and forces a native error, which I can use with a try-catch list-runner I made. However, it doesn't support pre- and post-SQL statements or the "Update: insert if new" output option, and the data is not sorted.
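The staging pattern described above can be sketched in plain SQL via SQLite (the table names and the single-transaction move are my own illustration of the idea, not the in-DB tools themselves):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER, val TEXT)")

# Stage the incoming rows in a temp table first (illustrative names).
cur.execute("CREATE TEMP TABLE staging (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO staging VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c")])

# Move everything into the target in one transaction, so readers see
# either none or all of the new rows -- never a partial load.
with conn:
    cur.execute("INSERT INTO target SELECT * FROM staging")
    cur.execute("DELETE FROM staging")

loaded = cur.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(loaded)  # 3
```

Because the final move is a single transaction, a concurrent reader never observes a half-written target table, which is the dirty read the temp-table approach avoids.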
So I'm still looking for a better solution.
Thank you for sharing your solution; it is always nice to know that other people have thought along similar lines. I really appreciate it.
Best regards,
Morten
I came across this same situation. Here was my solution.
1) When writing my data to a table in Workflow 1, I include a TotalRowCount column, so each row carries the total number of records I'm writing.
2) When pulling data from the table in Workflow 2, I include a WHERE clause such as: WHERE Table1.TotalRowCount = (SELECT COUNT(*) FROM Table1)
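The two steps above can be demonstrated with SQLite (the table name Table1 comes from the post; the sample values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Table1 (val TEXT, TotalRowCount INTEGER)")

# Workflow 1 intends to write 3 rows; so far only 2 have landed.
cur.executemany("INSERT INTO Table1 VALUES (?, ?)", [("a", 3), ("b", 3)])

# Workflow 2's guarded read: only returns rows once the actual row
# count matches the TotalRowCount each row claims.
guard = ("SELECT * FROM Table1 "
         "WHERE Table1.TotalRowCount = (SELECT COUNT(*) FROM Table1)")
partial = cur.execute(guard).fetchall()   # mid-load: counts disagree

# Once the final row arrives, the counts agree and the read succeeds.
cur.execute("INSERT INTO Table1 VALUES ('c', 3)")
complete = cur.execute(guard).fetchall()

print(len(partial), len(complete))  # 0 3
```

Mid-load, the subquery count (2) differs from TotalRowCount (3), so the guarded SELECT returns nothing; after the load finishes, all 3 rows come back.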