I need to execute a bunch of SQL scripts that must run in order, as they are dependent on each other. The workflow modifies some of the date values within each script. The scripts are stored locally (this is a requirement).
My plan was to feed each script into a Dynamic Input tool inside a Control Container and connect the Control Containers to enforce the sequence order. The scripts are fully self-contained and do all of the DROPs, CREATEs, and WRITEs.
The tables are correctly read from/written to, but I am getting the error "No columns returned". Based on some google-fu, this is because no results are being returned, which makes sense because it's true. I've tried several methods of appending an additional statement to the end to produce some "results", but that hasn't worked.
SQL attached below. Anyone have any other ideas?
-- Step 1: Clear all data from the destination table.
-- TRUNCATE TABLE is used for performance as it's a minimally logged operation.
TRUNCATE TABLE [Sandbox].[dbo].[WRITE_TEST_2];
-- Step 2: Copy all data from the source table to the destination table.
-- It assumes the column structure of both tables is identical.
INSERT INTO [Sandbox].[dbo].[WRITE_TEST_2]
SELECT *
FROM [Sandbox].[dbo].[WRITE_TEST_1]
WHERE BeginDate = '1901-01-01' AND EndDate = '1901-12-31';
-- Step 3: Return a result set to avoid the "no columns returned" error.
-- @@ROWCOUNT returns the number of rows affected by the last statement (the INSERT).
-- This provides a meaningful result and confirms the operation's success.
SELECT @@ROWCOUNT AS RowsCopied;
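The failure mode above can be reproduced outside Alteryx. The following is a minimal sketch using Python's built-in sqlite3 as a stand-in for the real SQL Server connection (the table and column names here are illustrative, not from the actual workflow): an INSERT statement produces no result set at all, so a driver that expects columns has nothing to describe, and appending a final SELECT of the affected row count, the same trick as `SELECT @@ROWCOUNT`, gives it real columns.

```python
import sqlite3

# In-memory database standing in for the real SQL Server sandbox.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE write_test_1 (id INTEGER, begin_date TEXT);
    CREATE TABLE write_test_2 (id INTEGER, begin_date TEXT);
    INSERT INTO write_test_1 VALUES (1, '1901-01-01'), (2, '1902-01-01');
""")

# The INSERT alone yields no result set -- the "No columns returned" situation.
cur = conn.execute(
    "INSERT INTO write_test_2 "
    "SELECT * FROM write_test_1 WHERE begin_date = '1901-01-01'"
)
print(cur.description)  # None: the driver has no columns to report

# Appending a SELECT of the affected row count gives the tool real columns,
# analogous to SELECT @@ROWCOUNT AS RowsCopied in the T-SQL above.
rows_copied = conn.execute("SELECT changes() AS RowsCopied").fetchone()[0]
print("RowsCopied:", rows_copied)  # 1
```

Note that sqlite3's `changes()` plays the role of `@@ROWCOUNT` here; the principle is the same on SQL Server.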
Yes, Control Containers could be used to sequence these.
I am curious, though: why this approach? Maybe I'm oversimplifying this, but could you:
To avoid the empty-rows issue, a Control Container would work well to perform a conditional test.
-Jay
@jrlindem The SQL script I gave is just a very small example. The actual scripts are hundreds of lines long and do multiple drops, creates, etc. We need the scripts to exist on a shared drive, for transparency, so they can be edited outside of Alteryx. In this case, Alteryx is just being used to execute a series of physical scripts in a specific order. No ETL is done in Alteryx other than changing some date variables in the scripts.
@csh8428
Ya, I kind of suspected that was going to be the case. Given that you have "edit" permissions back to your database, I feel like (if it were me) I'd probably opt to build out your solution using procedures/tasks in... dare I say, the database itself instead of in Alteryx 😬
@jrlindem I'm gonna pretend you didn't say that. ;)
Does it matter whether anything has completed within each script, or are you just running them in order? I would feed the scripts into a Batch Macro so that one script executes at a time, then the next, and so on.
I think you have the right method in using @@ROWCOUNT so that something is returned. If you want to do it without returning anything, it will require a little mucking around, as that's not the way Alteryx works. It could definitely be done, but it wouldn't be as native and would be harder to track/update.
Some other points:
As to running them, you have the option of (I advise Dynamic Input):
@KGT Yes, each successive script must be completed before the next one launches, as many are dependent on each other. If that didn't matter, I would just use the Run Command tool to launch PowerShell scripts.
OK, then I recommend a Batch Macro, so that one script executes at a time. There are many ways you could do this, but the simple method has the Batch Macro take the script as the control parameter for each batch.
I recommend keeping your @@ROWCOUNT, as that provides an output for each batch. You could make this fancier by customising the output of each script and returning that, but that's not the main focus here.
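For anyone unfamiliar with what the Batch Macro is doing here, the behaviour can be sketched in a few lines: run each script file to completion before starting the next, and capture a per-batch row count. This is only an illustration, with sqlite3 standing in for the real SQL Server connection; the folder layout, file names, and the `run_scripts_in_order` helper are hypothetical, not part of the original workflow.

```python
import sqlite3
import tempfile
from pathlib import Path

def run_scripts_in_order(conn: sqlite3.Connection, script_dir: Path) -> list[tuple[str, int]]:
    """Execute every .sql file in name order, one script at a time."""
    results = []
    for script in sorted(script_dir.glob("*.sql")):
        sql = script.read_text()
        conn.executescript(sql)      # blocks until the whole script finishes
        # Per-batch output, playing the role of SELECT @@ROWCOUNT.
        rows = conn.execute("SELECT changes()").fetchone()[0]
        results.append((script.name, rows))
    return results

# Hypothetical demo: two tiny dependent scripts in a temp folder.
tmp = Path(tempfile.mkdtemp())
(tmp / "01_create.sql").write_text(
    "CREATE TABLE t (x INTEGER); INSERT INTO t VALUES (1), (2);"
)
(tmp / "02_copy.sql").write_text(
    "CREATE TABLE t2 (x INTEGER); INSERT INTO t2 SELECT * FROM t;"
)

conn = sqlite3.connect(":memory:")
summary = run_scripts_in_order(conn, tmp)
print(summary)  # [('01_create.sql', 2), ('02_copy.sql', 2)]
```

Because each script only starts after the previous one has fully committed, the second script can safely depend on objects the first one created, which is exactly the guarantee the Batch Macro gives you.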