Hi All,
I am writing data from the Alteryx Output tool to a SQL Server database. Before the insert, I would like to delete the old records (the ones I am about to re-insert) for that particular period from the database.
Hi Jeeva,
There is no way to parameterize your pre or post SQL in Alteryx, but there are a few tricks to achieve this.
What you can do is create a control table in your database, something like the one below, and insert the file name(s) you have if you are dealing with multiple source files:
create table ETLcontrol (filename varchar(50), filedate date)
insert into ETLcontrol values ('Sample', null)
Then configure the pre and post SQL with the statements below:
--post SQL
Update ETLcontrol
set filedate = (select max(FileDate) from Test)
where filename='Sample'
--Pre SQL
delete from Test
where FileDate=(select top 1 filedate from ETLcontrol where filename='Sample')
After you configure this, the first run will load the data into your destination, and the post SQL will update the control table with the latest date processed for each file.
On the next run, the pre SQL will look up the date to be deleted and remove those rows before the load. You will need to customize the control-table SQL to your requirements.
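To illustrate the sequence with the example tables above (Test standing in for the destination table, which is assumed to have a FileDate column):

```sql
-- Run 1: filedate in ETLcontrol is still NULL, so the pre SQL deletes nothing.
-- The workflow then loads the period's rows into Test, and the post SQL
-- records the latest date that was loaded:
update ETLcontrol
set filedate = (select max(FileDate) from Test)
where filename = 'Sample';

-- Run 2 (reloading the same period): the pre SQL now finds that date and
-- removes the previously loaded rows before the fresh insert:
delete from Test
where FileDate = (select top 1 filedate from ETLcontrol where filename = 'Sample');
```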
Hope this helps you get started with pre/post SQL in Alteryx.
Please find attached a sample for your testing.
Hi. Thanks for your reply. But my file name will be different every month, so I cannot depend on that. My only option here is to use the date column from the incoming data. :(
Jeeva,
I think there is still a workaround.
What you can do is grab the file name from your input as well, pass it through your workflow, add a column to your destination table, and populate it with the file name.
Then modify both your pre and post SQL statements.
You can see the same in the file that I attached in my earlier reply.
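As a rough sketch of what those modified statements could look like (assuming the destination table Test gains a FileName column populated from the workflow, and ETLcontrol holds one row per file; adjust the names to your schema):

```sql
-- Post SQL (sketch): record the latest date loaded per file
update c
set filedate = t.maxdate
from ETLcontrol c
join (select FileName, max(FileDate) as maxdate
      from Test
      group by FileName) t
  on t.FileName = c.filename;

-- Pre SQL (sketch): delete the rows previously loaded for each known file
delete t
from Test t
join ETLcontrol c
  on c.filename = t.FileName
 and c.filedate = t.FileDate;
```

Note that file names appearing for the first time would still need a row inserted into ETLcontrol before the run (or handled via a MERGE in the post SQL).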
Hi,
Thanks for your reply. Yes, that makes sense. I tried to implement this, but I wanted to optimize it into one single workflow instead of one workflow to update the min and max date in the ETLcontrol table and another to run the actual load.
Since the input data is huge, having two workflows read the same input takes twice the time. Is it possible to write the min and max date along with the file name to ETLcontrol and make the rest of the workflow wait until that happens, then fetch the date from ETLcontrol and delete accordingly? I thought of using Block Until Done, but I am not sure whether it waits until the date insert happens, or whether I can pass data out of the Block Until Done tool.
Regards,
Jeeva.
Hi,
Yes, you can try Block Until Done.
Also give the "ParallelBlockUntilDone.yxmc" macro developed by Adam a try. Download it from the link below:
http://www.chaosreignswithin.com/p/macros.html
The solution here is to use a Batch Macro. Rather than doing any batching, the Batch Macro is just used to make the SQL statement dynamic; this is a fantastic use of batch macros and is often overlooked.
Configure the output tool and connect two control parameters, with actions, to update the start and end dates of the pre-SQL statement. The macro itself can consist of just two "data" tools, a Macro Input and an Output tool, as well as the four interface tools (two Actions and two Control Parameters).
Now, in the parent module, add the macro (which will have two inputs: the control parameters and the main data feed), connect the min and max values from a Summarize to the control parameters, and feed the data into the Macro Input.
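For reference, the pre-SQL inside such a macro would just contain literal dates for the two Actions to overwrite at run time, along these lines (the dates shown are arbitrary dummy values, and Test/FileDate are the example names from earlier in the thread):

```sql
-- Pre SQL inside the batch macro; the two Actions replace the literal
-- dates with the min/max values coming from the Summarize tool
delete from Test
where FileDate between '2015-01-01' and '2015-01-31';
```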
Hope that makes sense. I'm on the road today, but I can build an example module tonight if needed.
Regards
Chris
Great idea, Chris.
I am waiting to see your example :-) as this is a common need for every developer who deals with a database using Alteryx :)
Attached.
Typically I'd go a step further and have a log table updated in the post-SQL script.
In a lot of cases I'd also use a control table to control which tables I load and the changes to the database, but it looks unnecessary in this case. Alteryx makes a great ETL for these situations with the right implementation.
That is perfect. It works well. Thanks a lot Chris and pichaipillai for your ideas.