Case: I want to prevent reading from a database table that is being written to.
Problem: let’s say I have two workflows. Workflow 1 uses the regular Output tool in Alteryx to write 1 million records to a table. Workflow 2 then starts reading from the same table that workflow 1 is writing to, and this results in a dirty read – workflow 2 only reads the records that had been committed at the time of reading.
The way I understand the Output tool, it chunks the writing session into smaller transactions (with a size specified by the user) and finishes once the whole session completes. However, it doesn’t lock the table while writing, so dirty reads can happen if workflow 2 reads while workflow 1 is still writing.
The situation occurs when workflows on the Alteryx Server run simultaneously.
In many cases workflows are executed in an extract, transform, load order to avoid concurrency problems. That is not possible here, though, as the workflows need to run on different schedules with overlapping runtimes.
The database runs on MSSQL. What I’ve already tried:
- Changing the isolation level on the database, but Alteryx ignores these settings.
- Taking locks in a pre-SQL statement, but they are released right before the writing session starts.
- The SQL Server bulk loader protocol, but dirty reads still occur.
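For context, the pre-SQL lock attempt might look like the sketch below (the table name `dbo.TargetTable` is a placeholder, not from the original post). Since the Output tool runs the pre-SQL statement in its own batch and commits it before the write session opens, any lock taken there is already gone when rows start flowing:

```sql
-- Hypothetical pre-SQL statement; dbo.TargetTable is a placeholder name.
-- TABLOCKX requests an exclusive lock on the whole table.
BEGIN TRANSACTION;
SELECT TOP 0 * FROM dbo.TargetTable WITH (TABLOCKX);
COMMIT TRANSACTION;
-- The lock is released at COMMIT, i.e. before the Output tool's
-- separate write session begins, so readers can still get in.
```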
What I've already done is create, in the pre- and post-SQL statements, a table with two columns: start (pre-SQL) and finish (post-SQL). That way, if finish is null, I know the table is currently being updated.
You have the option to create a counter to wait or to simply stop the process.
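A minimal sketch of that marker-table idea, assuming illustrative table and column names (none of these identifiers are from the original post):

```sql
-- One-time setup: a load-status table with a start and a finish column.
CREATE TABLE dbo.LoadStatus (
    TableName  sysname   NOT NULL,
    StartTime  datetime2 NOT NULL,
    FinishTime datetime2 NULL
);

-- Pre-SQL: record that the load has started (finish stays NULL).
INSERT INTO dbo.LoadStatus (TableName, StartTime, FinishTime)
VALUES ('dbo.TargetTable', SYSUTCDATETIME(), NULL);

-- Post-SQL: mark the load as finished.
UPDATE dbo.LoadStatus
SET FinishTime = SYSUTCDATETIME()
WHERE TableName = 'dbo.TargetTable' AND FinishTime IS NULL;

-- Reader workflow: stop (or wait and retry) while a load is open.
IF EXISTS (SELECT 1 FROM dbo.LoadStatus
           WHERE TableName = 'dbo.TargetTable' AND FinishTime IS NULL)
    RAISERROR ('Table is currently being loaded.', 16, 1);
```

The reader-side check is where the "counter to wait or simply stop" choice comes in: either raise an error immediately, as above, or poll until FinishTime is set.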
Thank you for sharing your solution. I actually worked with the same idea, but went with something else, as it requires too much configuration every time you need to input or output data.
Instead I use the in-db tools:
First I use a Data Stream In tool to create a temp table.
Then I bulk upload to the table using Write Data In-DB.
This prevents dirty reads and forces a native error, which I can use with a try-catch list-runner I made. However, it doesn't support pre- and post-SQL statements, the "Update: Insert if New" output option, and the data is not sorted.
So I'm still looking for a better solution.
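Conceptually, the temp-table route avoids dirty reads because the move from the staging table into the target happens in a single transaction, so readers see either all of the new rows or none of them. A rough SQL equivalent of that final step (the staging table name is illustrative):

```sql
-- Stage the full load first; readers never query this table.
-- ...bulk load into dbo.TargetTable_Staging happens here...

-- Then publish atomically: one transaction, one commit.
BEGIN TRANSACTION;
INSERT INTO dbo.TargetTable
SELECT * FROM dbo.TargetTable_Staging;
COMMIT TRANSACTION;
-- Under READ COMMITTED, a concurrent reader either blocks until the
-- commit or sees the table without the new rows, never a partial batch.
```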
Thank you for sharing your solution; it is always nice to know that other people have thought along similar lines. I really appreciate it.