I am looking for advice on how to set a workflow's database connections at runtime based on UI selections.
We have a standard workflow that is run regularly / as needed against any of a few dozen SQL Server databases, which share the same table/proc/etc structure, and are located on several different servers. In the past, manual operations have been used to export a file from a database for the workflow input, and to import the workflow output back to SQL Server.
We would like to replace the manual export/import operations with parameterized stored procedures that are called from the workflow to read and write data in the database of interest. Instead of manually configuring the db connection for each of the procs each time we run this against a new or different database, we would like to have all of the connection strings' server and database names be set in one place, preferably based on a drop-down list selection.
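To make the goal concrete, here is a minimal sketch (outside Alteryx, in plain Python) of the "set it in one place" idea: one function turns the selected server and database into a connection string, and another renders a parameterized stored-procedure call. All names (driver, procedure, parameters) are illustrative assumptions, not from the actual workflow.

```python
# Hypothetical sketch: derive both the connection string and the
# stored-procedure call from the user's server/database selection,
# so nothing is hard-coded per database. Names are illustrative only.

def connection_string(server: str, database: str) -> str:
    """Build an ODBC-style SQL Server connection string."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server={server};Database={database};Trusted_Connection=yes;"
    )

def exec_statement(proc: str, **params) -> str:
    """Render 'EXEC proc @a = ..., @b = ...' from keyword arguments."""
    args = ", ".join(f"@{k} = {v!r}" for k, v in params.items())
    return f"EXEC {proc} {args}" if args else f"EXEC {proc}"

# The same template then serves any of the dozens of databases:
conn = connection_string("192.168.0.10", "SalesDB")
sql = exec_statement("dbo.GetOrders", Year=2020)
```

In Alteryx terms, the dropdown selection would feed the two values that `connection_string` takes here, and every proc call would read from that single shared source.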
I am looking for guidance on how to approach this. I've been working with Alteryx for years but am fairly new to analytic apps. I've built a series of chained apps (some UI and data inputs are dependent on choices made in prior steps) that handles most of what I need, including pulling database and server names from a table based on UI choices, but I'm having trouble finding how to use these values to set the workflow's proc connections at runtime. What is the proper way to do this?
If I'm getting your point right, my approach would be to create a Batch Macro, with the list of connections mapped to the Control Parameter (if I'm not, please correct me and we'll figure out a solution).
In your workflow, just filter the connections you want to use (or have the user select them from a Drop Down tool in the UI) and then call the macro to do the magic.
I'm attaching a simple macro and workflow that take a list of connections, open each one, and save the contents of each connection to a .yxdb file on disk.
You can start building up from here.
Hope this helps point you in the right direction.
Hi @aguisande, Thank you for the reply. After looking into this more I realize I should probably ask a different question.
I actually had this process working well, until a change on the database side broke it. The previously-working version is a series of three apps chained together. The first accepts the user's inputs through dropdowns and text boxes, and saves these parameter values to text files in the app directory to be read by the downstream apps in the chain. Two of those parameters are the name of the SQL Server database, and the server on which it resides (e.g., 192.168.xxx.xxx).
The second and third apps in the chain call various stored procedures which read and write records in the database. Each proc is called via a Dynamic Input tool using an OLEDB connection. A pair of Action tools use 'Update Value - Replace a specific string' to update the database name and server IP address portions of the OLEDB connection string. A Text Input tool passes a parameter value to the Dynamic Input, which passes it to the proc via 'Modify SQL Query - Update Stored Procedure'.
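The replacement the Action tools perform can be illustrated (purely as a sketch, outside Alteryx) as plain string substitution on the connection string. The template, placeholder names, and example values below are all hypothetical.

```python
# Illustrative stand-in for the two 'Update Value - Replace a specific
# string' Action tools: swap placeholder server and database tokens in
# an OLEDB-style connection string. All names here are hypothetical.

TEMPLATE = (
    "Provider=SQLOLEDB;"
    "Data Source=PLACEHOLDER_SERVER;"
    "Initial Catalog=PLACEHOLDER_DB;"
    "Integrated Security=SSPI;"
)

def apply_parameters(template: str, server: str, database: str) -> str:
    """Mimic the Action tools: replace specific strings in the template."""
    return (template
            .replace("PLACEHOLDER_SERVER", server)
            .replace("PLACEHOLDER_DB", database))

conn = apply_parameters(TEMPLATE, "192.168.1.25", "InventoryDB")
```

This only works because the OLEDB connection string is stored as visible text containing the server and database names, which is exactly what breaks when the connection is saved as an alias instead.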
The process described above worked well until a security upgrade on the database side (TLS 1.0 -> TLS 1.2) caused OLEDB to stop working. After discussing with our network staff, I decided to use ODBC instead. I think this will require a different approach, since the Dynamic Input data source is now something like "aka:ODBC...." rather than the explicit connection string containing the database name and server address that the Action tools are configured to replace. How can I update the connection at runtime when the Dynamic Input tool shows it in "aka:ODBC...." format?
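One direction worth exploring (hedged, since I can't verify your environment): instead of the saved "aka:" alias, a fully spelled-out DSN-less ODBC connection string again exposes the server and database names as plain text that Action tools can replace. A sketch of what building such a string might look like; the driver name and keyword names are assumptions, so check what your driver installation actually accepts.

```python
# Hypothetical sketch of a DSN-less ODBC connection string whose
# Server= and Database= tokens stay visible as plain text, so string
# replacement can target them the same way it did with OLEDB.
# Driver name and keywords are assumptions; verify against your setup.

def dsnless_odbc(server: str, database: str) -> str:
    return (
        "odbc:Driver={ODBC Driver 17 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Trusted_Connection=yes;"
    )

# e.g. paste the rendered string into the Dynamic Input in place of
# the "aka:..." alias, then point the Action tools at its tokens:
conn = dsnless_odbc("192.168.5.40", "ReportingDB")
```

The design idea is the same one that worked before the TLS change: keep the server and database visible in the connection text so a simple "replace a specific string" action can retarget it at runtime.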