Hi, I'm searching for a way to simply execute generic SQL commands against a server. Specifically, I'd like to execute some stored procedures on the server after loading data into staging tables using Alteryx.
I know you can use custom SQL as part of data input connectors, but I'm looking for a way to run other SQL statements against a server, such as EXEC and DELETE statements, etc.
I should mention that the batch target gets cleared once, prior to the 100 tables being appended to it.
A couple of questions...
The reason I'm asking is that, based on my current understanding of your process, I'm thinking you could eliminate the "staging" table where you accumulate all of the data and just use a Dynamic Input with an Output that separates into the individual tables. That could eliminate the batch process altogether (or at least put the Dynamic Input and Output into a batch macro).
No, I am obviously not explaining it well, sorry.
The target never gets split; this is the final table for analysis in Tableau (for instance).
Think of it like this (this isn't what's happening, but it is similar): you have 100 stores, they all have identical spreadsheets, and they each have to do a budget for you.
You take all 100 Excel budgets and load them all into a single table to analyse all stores against each other.
Well, or maybe I'm just "thick".
So since in your screenshot there is no Output from the batch macro, I assume that you are writing to an Output within the macro with an Output Option of "Append Existing"?
Since a batch macro essentially "unions" all of the records anyway, could you not bring the Output outside of the macro (replace it inside the macro with a Macro Output tool) with the Output Option of "Overwrite Table (Drop)" and just let Alteryx write out to the single table?
Just an aside: you could use sqldf (or maybe RODBC) in the R Command tool if you wanted to simply execute some SQL.
You could use that (I believe) if you wanted to do something like drop a table.
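The general pattern behind that aside is just open connection, execute statement, commit. As a minimal sketch of that idea (shown in Python with the built-in sqlite3 module as a stand-in for a real server connection; with RODBC in R or an ODBC driver against SQL Server the connection calls would differ, and the table name here is made up):

```python
import sqlite3

# Stand-in for a server connection; against a real server you would open
# an ODBC connection instead. This only demonstrates the pattern.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Any statement string can be executed this way: DDL, DELETE, DROP, etc.
cur.execute("CREATE TABLE staging (id INTEGER, val TEXT)")
cur.execute("INSERT INTO staging VALUES (1, 'a'), (2, 'b')")
cur.execute("DELETE FROM staging WHERE id = 2")
conn.commit()

cur.execute("SELECT COUNT(*) FROM staging")
print(cur.fetchone()[0])  # one row remains after the DELETE
```

The same open/execute/commit shape is what an R Tool snippet with RODBC would be doing under the hood.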
Also, you obviously have the loading stuff working inside the batch macro, but you might want to check out using the Dynamic Input tool. That is what I would typically use for the example of having 100 tables I want to combine into one. I would feed the list of table names into the Dynamic Input and modify the SQL Query in it so that it changes the query string for each record coming in. (It basically functions as a batch process this way.) I would then just attach an Output tool with the Drop option.
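The "modify the SQL Query per record" behaviour of the Dynamic Input can be sketched as a loop that rewrites one query template for each incoming table name and unions the results. This is only an illustration of the pattern, again using sqlite3, with hypothetical store tables standing in for the 100 sources:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical per-store tables standing in for the 100 source tables.
for name, val in [("store_1", 10), ("store_2", 20), ("store_3", 30)]:
    cur.execute(f"CREATE TABLE {name} (budget INTEGER)")
    cur.execute(f"INSERT INTO {name} VALUES ({val})")

# The Dynamic Input pattern: feed in a list of table names, substitute
# each one into the same query template, and union the results.
table_names = ["store_1", "store_2", "store_3"]
rows = []
for name in table_names:
    cur.execute(f"SELECT budget FROM {name}")  # templated query string
    rows.extend(cur.fetchall())

print(len(rows))  # one row per source table
```

Each iteration plays the role of one "batch"; the accumulated `rows` list is the unioned output you would then write out once.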
I am a newbie, and macros are still all new to me; I will investigate this further.
Are you saying that with an output option in the batch macro, I could have all the output write to a single table that gets dropped once at the start of the batch and repopulated with all of the content (the union of all of my batches)? If yes, then this is what I want.
If it deletes on each "batch", then it is not what I need.
I will investigate
No, it isn't that simple, I'm afraid. I misled you with my 100 Excel sheets example, I think. It is actually more like 100 URL API calls with multiple XML parsing tools on each batch.
Thanks, I will check this out
No problem with being a "newbie"...that's how we all started.
Hopefully this will help...
At the end of all of the processing you have in your batch macro, you would finish with a Macro Output tool. (I've just taken a screenshot of a batch macro I've created in the past with lots of parsing/blending/manipulation as an example. I assume yours might be even more complex.)
In the Properties of the batch macro (under the Interface Designer view), you can set up how you want the data to come out of the macro. (If you've used the Union tool already, you will recognize the options.)
If the schema for each URL read is the same, you would typically use the first (default) option.
Your batch macro will now have an output anchor that you can attach an Output tool to. This is where you would write to the "all records" table and you would set the options in the Output tool to "Overwrite Table (Drop)".
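To make the earlier question explicit: "Overwrite Table (Drop)" on that single Output tool means the target is dropped and recreated once, before any records arrive, and then the unioned output of all the batches is written into it. A minimal sketch of that semantics (Python/sqlite3; the table, column, and batch data are all made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Overwrite Table (Drop)": the target is dropped and recreated once,
# up front, not once per batch.
cur.execute("DROP TABLE IF EXISTS all_records")
cur.execute("CREATE TABLE all_records (store TEXT, budget INTEGER)")

# Each batch iteration appends its rows; nothing is cleared in between,
# so the final table holds the union of all batches.
batches = [[("store_1", 10)], [("store_2", 20)], [("store_3", 30)]]
for batch in batches:
    cur.executemany("INSERT INTO all_records VALUES (?, ?)", batch)
conn.commit()

cur.execute("SELECT COUNT(*) FROM all_records")
print(cur.fetchone()[0])  # all three batches survive in one table
```

If instead the drop happened inside the loop, only the last batch would remain, which is the behaviour the question above was worried about.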
Hope this all makes sense...