
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Run multiple/complex queries in Teradata through Alteryx

JackeyCJQ
5 - Atom

Hi Everyone,

 

I am currently in the process of translating a large complex piece of code in Teradata left by my predecessor into an Alteryx workflow.

 

This code has several layers, in a structure that is quite generic by Teradata standards (a simplified sketch follows the list):

  1. Create volatile tables from data in the existing data warehouses/sandpit
  2. Use these tables, plus data from the data prep stage, to create further volatile tables or to feed WHERE ... IN clauses
  3. There are also UPDATE, ALTER and INSERT steps
  4. The final output is a set of permanent tables (which are used to produce reports for stakeholders)
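
For concreteness, the pattern is roughly along these lines (a heavily simplified sketch; all table and column names here are invented for illustration, not the real ones):

/* 1. Volatile table built from the existing warehouse/sandpit data */
CREATE VOLATILE TABLE vt_customers AS (
    SELECT customer_id, segment, open_date
    FROM   dw.customer_master
    WHERE  open_date >= DATE '2020-01-01'
) WITH DATA
PRIMARY INDEX (customer_id)
ON COMMIT PRESERVE ROWS;

/* 2. Further volatile tables that re-use the first, e.g. in a WHERE ... IN clause */
CREATE VOLATILE TABLE vt_sales AS (
    SELECT s.customer_id, s.sale_amt, s.sale_date
    FROM   dw.sales s
    WHERE  s.customer_id IN (SELECT customer_id FROM vt_customers)
) WITH DATA
PRIMARY INDEX (customer_id)
ON COMMIT PRESERVE ROWS;

/* 3. Update / alter / insert steps */
UPDATE vt_sales
SET    sale_amt = 0
WHERE  sale_amt IS NULL;

/* 4. Final output: permanent tables used for stakeholder reports */
INSERT INTO sandpit.report_sales_summary
SELECT customer_id, SUM(sale_amt) AS total_sales
FROM   vt_sales
GROUP BY customer_id;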

 

Rather than trying to recreate all 1,475 lines of code with Alteryx tools, is there a way for me to say, "at this point, run all of this code in Teradata", and then come back and use the resulting final outputs to produce my end report?

 

Part of me feels that .bat/BTEQ files may work; however, due to some limitations in my workplace, BTEQ is not available to my colleagues (in case I am away and they need to run this). I would like to explore potential alternatives.

 

Below is a (very poor) representation of what I am looking to achieve:

 

 

[Attached image: Capture.JPG, a rough diagram of the intended workflow]

 

Thank you in advance

6 REPLIES
MatthewO
Alteryx Alumni (Retired)

@JackeyCJQ have you explored using the Pre SQL configuration in the Input Data tool? This allows code to be executed before the main query. You could potentially have the scripts run there and then read the results into Alteryx through the Input Data tool. The following article contains more information about these options: https://community.alteryx.com/t5/Alteryx-Designer-Knowledge-Base/Alteryx-Pre-Post-SQL-Statements/ta-...
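
Roughly, the pieces would map as below (a sketch only; the table names are placeholders, not anything from your environment). The semicolon-separated statements go in the Pre SQL Statement field, and the tool's main query then reads the final result back into the workflow:

-- Pre SQL Statement (runs before the main query; statements separated by semicolons):
CREATE VOLATILE TABLE vt_stage AS (
    SELECT customer_id, sale_amt
    FROM   dw.sales
) WITH DATA
ON COMMIT PRESERVE ROWS;

INSERT INTO sandpit.report_output
SELECT customer_id, SUM(sale_amt) AS total_sales
FROM   vt_stage
GROUP BY customer_id;

-- Main query of the Input Data tool (brings the final result into the workflow):
SELECT * FROM sandpit.report_output;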

 

 

JackeyCJQ
5 - Atom

Thank you for the suggestion. I will experiment with it and get back to you if it solves the problem.

JackeyCJQ
5 - Atom

Hi MatthewO

 

This solution fits well when my SQL statement is at the start of my workflow. Perhaps I can break my workflow into two parts: 1) update the data (which will be imported into Teradata), and 2) run the big SQL script.

 

That said, is there a way to bridge that process without separating into two distinct workflows?

MatthewO
Alteryx Alumni (Retired)

@JackeyCJQ I would recommend exploring the Dynamic Input tool. It accepts an input data stream, so it can run after the Input Data tool.
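
As a rough sketch (placeholder names again): the source template inside the Dynamic Input tool holds a query with a known placeholder value, and the records arriving on the input stream swap it in at run time through the Modify SQL Query options (for example, Replace a Specific String). Because the tool waits for incoming records, the query only fires once the upstream part of the workflow has finished:

-- Source template query configured in the Dynamic Input tool:
SELECT customer_id, total_sales
FROM   sandpit.report_sales_summary
WHERE  run_date = DATE '1900-01-01';  -- '1900-01-01' is the placeholder string

-- At run time, each incoming record replaces '1900-01-01' with the real run date
-- before the query is sent to Teradata.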

JackeyCJQ
5 - Atom

Thank you Matthew.

 

This solution did the main thing I needed: it went to Teradata and ran all the code.

As for the multiple branching outputs, I used secondary tools to pick up each of those outputs individually.
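
For anyone following along, that last step simply meant one downstream query per final permanent table, along the lines of (names invented for illustration):

-- First downstream tool:
SELECT * FROM sandpit.report_sales_summary;

-- Second downstream tool:
SELECT * FROM sandpit.report_customer_detail;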

Birinder_Sangha
5 - Atom
@JackeyCJQ would you please share the details of how you solved this? I am working through a similar scenario. Thanks in advance.