I am trying to update a SQL Server database table dynamically. The purpose of this is to log to a database table for our Alteryx modules; by using generic column names, the table can be reused many times. A lookup table determines which column is to be updated.
To do this I have created 50 filters (one per column), and each Output tool updates that one column. However, this means a lot of duplication, and every filter fires, which will be too much overhead.
If I could push this back to the SQL Server database, I would build the UPDATE statement dynamically and execute it. However, I can't seem to use the 'Run Command' tool for this, as I am unable to pass in the values from the Alteryx workflow which I am trying to log. The Output tool's Post/Pre SQL field has the same problem.
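For reference, the dynamic approach described above can be sketched outside Alteryx. This is only an illustration, assuming a lookup that maps each logical metric name to one of the generic columns; the table name, column names, and lookup contents are hypothetical placeholders, not the actual schema.

```python
# Hypothetical lookup table: logical metric name -> generic column name.
# In practice this would be read from the SQL Server lookup table.
COLUMN_LOOKUP = {
    "row_count": "Value01",
    "reject_count": "Value02",
    "run_duration": "Value03",
}

def build_update(metric_name: str, table: str = "ModuleLog") -> str:
    """Return a parameterized UPDATE for the generic column mapped to metric_name."""
    try:
        column = COLUMN_LOOKUP[metric_name]
    except KeyError:
        raise ValueError(f"No generic column mapped for {metric_name!r}")
    # Column names cannot be bound as query parameters, so the column is
    # substituted from the (trusted) lookup; the logged value and the row
    # key stay as ? placeholders to avoid injecting data into the SQL text.
    return f"UPDATE {table} SET {column} = ? WHERE LogId = ?"
```

The generated statement would then be executed with the value and row id bound as parameters (e.g. via a driver such as pyodbc), so one generic statement builder replaces the 50 per-column branches.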
I don't want to use an app for this. Not that I know a lot about apps, but I want the code to be held on our server and not remotely.
Can you please let me know if there is an alternative to the above solution?
Do you have write permissions on the database you are trying to do this against? I'd test that first by using your favorite SQL editor and running a simple 'CREATE TABLE myTempTable'-type statement on your schema to make sure you have the correct rights.
If you do, then you should be able to use an Output tool to write the results of your data flow to a table on that SQL Server.
Yes, I have write permission and this code is working. This is more of a best-practice question.
What I am asking is whether this can be simplified. The filters in my screenshot are only a small section; there are 50 columns in the table and therefore 50 filters. To me this is duplication of code and effort, and I hope there is an easier way.
In that case, I might use a Select tool (to rename columns) followed by a Union tool after all the filters to build a single table, then write that back to the database just once (vs. doing 50 writes that each add only one thing).
I have thought about this and would prefer to keep them as separate updates.
The logging will capture data (e.g. row counts, field values) throughout the module, not just at the end. The logging code is held in a macro, and I want it to be called multiple times in an Alteryx module (up to 50 times). If the calling module instead kept all the values as it passed through and called the macro once to update all the columns, it would not be flexible: the module's components would have to fire in a fixed order, development of the logging would be time consuming, and values might be overwritten.