Hi Everyone,
I am currently translating a large, complex Teradata script left by my predecessor into an Alteryx workflow.
The code is built in several layers, which is fairly typical by Teradata standards.
Rather than trying to recreate all 1,475 lines of code with Alteryx tools, I figured: is there a way to say, "at this point, run all of this code in Teradata", and then come back and use the subsequent final outputs to produce my end report?
Part of me feels that bat/BTEQ files may work; however, due to some limitations in my workplace, BTEQ is not available to my colleagues (who would need to run this if I were away). I would like to explore potential alternatives.
Below is a (very poor) representation of what I am looking to achieve:
Thank you in advance
@JackeyCJQ have you explored the Pre SQL Statement configuration in the Input Data tool? It allows SQL to be executed before the main query, so you could potentially have the scripts run there and then read the results into Alteryx through the same Input Data tool. The following article contains more information about these options: https://community.alteryx.com/t5/Alteryx-Designer-Knowledge-Base/Alteryx-Pre-Post-SQL-Statements/ta-...
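For illustration, here is a minimal sketch of what the Pre SQL Statement field might hold, assuming the Pre SQL runs on the same session as the main query (so a volatile table survives long enough to be read). The database, table, and column names below are made up, not from the original script:

/* Pre SQL Statement field: semicolon-separated statements that run
   before the main query. A volatile table lives only for the session,
   so it needs no clean-up afterwards. */
CREATE VOLATILE TABLE sales_summary AS (
    SELECT region
         , SUM(amount) AS total_amount
    FROM   mydb.sales_raw
    GROUP  BY region
) WITH DATA
ON COMMIT PRESERVE ROWS;

/* Main query of the Input Data tool, executed after the Pre SQL: */
SELECT region, total_amount
FROM   sales_summary;

Using a volatile table here is just one option; the same field could equally run the legacy script's existing CREATE/INSERT steps as-is.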
Thank you for the suggestion; I will experiment with it and get back to you if it solves this problem.
Hi MatthewO,
This solution fits well when my SQL statement sits at the start of my workflow. Perhaps I can break the workflow into two parts: 1) update the data (which gets imported into Teradata), and 2) run the big SQL scripts.
That said, is there a way to bridge that process without separating it into two distinct workflows?
@JackeyCJQ I would recommend exploring the Dynamic Input tool. It accepts records from an incoming data stream, so it can be run after the Input Data tool (and after your update steps) within the same workflow.
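As a rough sketch of that pattern (the field name and query text below are hypothetical, not from this thread): one common setup is to give the Dynamic Input tool a placeholder template query and use its "Modify SQL Query > Replace a Specific String" option, so the big script's final SELECT only runs once the upstream records arrive:

-- Template query configured inside the Dynamic Input tool:
SELECT 1 AS placeholder_col;

-- Replacement SQL carried on an incoming field (e.g. [FinalQuery]) and
-- substituted for the template text at run time:
SELECT report_date
     , account_id
     , closing_balance
FROM   mydb.final_output
WHERE  report_date = CURRENT_DATE;

Because the substitution happens per incoming record, the tool naturally waits for the earlier branch of the workflow to finish before it queries Teradata.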
Thank you Matthew.
This solution did the main job for me, in that it went to Teradata and ran all the code.
As for the multiple branching outputs, I used additional downstream tools to work with each output individually.