I am creating an app that allows users to make changes to a database table using the "Data Stream In" tool with "Overwrite Table" as the creation mode.
Before overwriting the table, I perform several data validation steps on their changes. If there is an issue with their changes, I raise an error using the Message tool.
I have also added a "Block Until Done" tool and selected "Cancel Running Workflow on Error" in the Runtime settings. Even with these settings, the "Data Stream In" tool always executes the PreSQL command ("Executing PreSQL: DROP TABLE table_name") despite my attempts to stop the workflow before this step. Does anyone have any advice on keeping my table from being dropped when the new data does not pass my data validation steps?
@phottovy could you attach your workflow here for further analysis?
@RishiK I did my best to put together an example workflow with fake data. Unfortunately, I can't use an actual In-DB connection string. As I mentioned in my first comment, I normally have "Cancel Running Workflow on Error" selected, but I left it unchecked in the attached example so you can see all the possible error messages. Despite my attempts to stop the workflow early, the Data Stream In tool always performs the Executing PreSQL: "DROP TABLE" step, effectively erasing my original data.
The only idea I have at this point is to move the Data Stream In tool into an embedded macro and try to stop it from dropping the original table inside the macro. This seems like overkill but might work if configured correctly.
@phottovy thanks for the workflow. Have you tried moving the Block Until Done tool earlier in the workflow?
Another option is to place the Data Stream In tool inside a tool container and enable or disable that container based on a condition:
https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Dynamically-enable-disable-containers/...
A third option, if all else fails: write to a "K99" or temp staging table first, then, once it is loaded with your new data, update the original table.
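The staging-table idea can be sketched outside of Alteryx. This is a minimal illustration of the pattern, not the actual workflow: SQLite stands in for the real In-DB connection, and the table name, column names, and validation rule are all hypothetical. The point is that the destructive step only runs after the staged data passes validation.

```python
import sqlite3

def safe_overwrite(conn, new_rows):
    """Load new_rows into a staging table, validate, and only then
    replace the original table (names here are hypothetical)."""
    cur = conn.cursor()

    # 1. Stream the incoming data into a staging table; the original
    #    table is untouched at this point.
    cur.execute("DROP TABLE IF EXISTS table_name_staging")
    cur.execute("CREATE TABLE table_name_staging (id INTEGER, value TEXT)")
    cur.executemany("INSERT INTO table_name_staging VALUES (?, ?)", new_rows)

    # 2. Validate the staged data; abort before any destructive step.
    #    (Example rule: no NULL values allowed.)
    cur.execute("SELECT COUNT(*) FROM table_name_staging WHERE value IS NULL")
    if cur.fetchone()[0] > 0:
        conn.rollback()
        raise ValueError("validation failed: staged data contains NULLs")

    # 3. Only now drop the original and promote the staging table.
    cur.execute("DROP TABLE IF EXISTS table_name")
    cur.execute("ALTER TABLE table_name_staging RENAME TO table_name")
    conn.commit()

# Demo: original data survives a failed load.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_name (id INTEGER, value TEXT)")
conn.execute("INSERT INTO table_name VALUES (1, 'original')")
conn.commit()

try:
    safe_overwrite(conn, [(2, None)])  # bad data: rejected in step 2
except ValueError:
    pass
print(conn.execute("SELECT value FROM table_name").fetchall())  # [('original',)]
```

In Alteryx terms, the Data Stream In tool would always target the staging table, and a downstream In-DB or Post SQL step would perform the promotion only after validation succeeds.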