Problem I'm Trying to Solve with Alteryx
I have multiple queries that I run manually every morning. They update datasets the team relies on, and I would like to automate them so they run each morning without manual involvement from me.
Technical Setup
We are running Teradata 16.2 and Alteryx Designer 2021.3.2.54175.
The queries aren't huge, but they are on the larger side (1,000-2,000 lines). They combine data from multiple tables (all Teradata) and involve a significant number of transformations and joins to get the data into its final form. Once the final dataset is created, the table it replaces is dropped and recreated with the new dataset.
The complexity of the queries is managed by using volatile tables (10 - 20 per query). These all run perfectly well as a continuous script in Teradata Studio. In Alteryx, I'm running most of the SQL in the Input Data tool's Pre SQL Statement, using volatile tables to store intermediate data as needed, and ultimately combining the volatile tables into a final volatile table that is pulled in the Input Data tool's SQL Statement.
Here is the structure all my volatile tables use. Occasionally I use CTEs within the creation of the volatile table.
CREATE MULTISET VOLATILE TABLE VT_Name
AS
(
SELECT *
FROM DATABASE.DATATABLE
)
WITH DATA
ON COMMIT PRESERVE ROWS;
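To make the Pre SQL / SQL split concrete, here is a minimal sketch of how several of these statements chain together (table and column names are hypothetical, not from my actual queries):

```sql
-- Pre SQL Statement: build intermediate volatile tables
CREATE MULTISET VOLATILE TABLE VT_Orders AS
(
    SELECT order_id, customer_id, order_date
    FROM SALES_DB.ORDERS
)
WITH DATA
ON COMMIT PRESERVE ROWS;

CREATE MULTISET VOLATILE TABLE VT_Final AS
(
    SELECT o.order_id, c.segment
    FROM VT_Orders o
    JOIN SALES_DB.CUSTOMERS c
      ON c.customer_id = o.customer_id
)
WITH DATA
ON COMMIT PRESERVE ROWS;

-- SQL Statement: pull the final volatile table into Alteryx
SELECT * FROM VT_Final;
```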
Alteryx Issue
The Pre SQL Statement never runs to completion. I keep getting one particular error.
3932 Only an ET or null statement is legal after a DDL Statement.
The odd thing is that I can create one, two, or sometimes three volatile tables before the error happens. There is nothing different about the syntax used to create any of the volatile tables. However, once I added the fourth volatile table's code to the Pre SQL Statement, the error started appearing.
Does anyone have any insight into how I can solve this issue?
Thanks!
I'm not a Teradata expert, but from looking at the docs, your error is related to not explicitly wrapping your DDL in a transaction (BT...ET). Since the operation works in Teradata Studio (TS), it may be that in Alteryx the Pre-SQL and SQL parts of an Input tool are not grouped into transactions the same way they are in TS.
One option you can look into is using global temporary (GT) tables instead of volatile tables. Because a GT's definition is created once and persists in the data dictionary, using GTs might eliminate the need for DDL statements at run time (at least for defining the GTs) and allow you to schedule everything.
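For illustration, the global temporary approach splits the work into one-time DDL and per-session DML. A rough sketch, with hypothetical table and column names:

```sql
-- One-time DDL, run outside the scheduled workflow (e.g., by a DBA or a setup script).
-- The definition persists; each session gets its own private instance of the rows.
CREATE MULTISET GLOBAL TEMPORARY TABLE GT_Staging
(
    order_id   INTEGER,
    order_date DATE
)
ON COMMIT PRESERVE ROWS;

-- In the Alteryx Pre-SQL: DML only, no DDL needed at run time
INSERT INTO GT_Staging
SELECT order_id, order_date
FROM SALES_DB.ORDERS;
```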
Dan
@danilang, you were completely correct. Once I wrapped each volatile table creation statement in the BT / ET statements, my errors went away. I wasn't aware of these statements. Thank you so much for bringing them to my attention. Much much appreciated.
Hello, I am trying to build a very similar workflow in Alteryx. I have never used the BT/ET statements. Could you provide an example of how you incorporate these statements in a query? Also, is the Pre-SQL portion of the Input tool still needed when using these statements?
My current query looks similar to this:
CREATE MULTISET VOLATILE TABLE *table name* AS (select *****) WITH **** ON COMMIT PRESERVE ROWS;
CREATE MULTISET VOLATILE TABLE *table name* AS
(
select *****)
WITH ****
ON COMMIT PRESERVE ROWS;
*insert many more similar statements*
with ** as ()
Select **
Order by **
Is there something specific that needs to be done with the ending statement as well, or does that just get tacked on at the end?
Thank you in advance!
BEGIN TRANSACTION;
CREATE MULTISET VOLATILE TABLE TABLE_NAME AS
(
YOUR QUERY
)
WITH DATA
ON COMMIT PRESERVE ROWS;
END TRANSACTION;
Do this for each volatile table you want to create. These statements work within the pre-SQL statement. They should also work in SQL and Post-SQL statements.
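Applied to the pattern above, a full Pre-SQL might look like this (table and column names are hypothetical):

```sql
BEGIN TRANSACTION;
CREATE MULTISET VOLATILE TABLE VT_Step1 AS
(
    SELECT order_id, amount
    FROM SALES_DB.ORDERS
)
WITH DATA
ON COMMIT PRESERVE ROWS;
END TRANSACTION;

BEGIN TRANSACTION;
CREATE MULTISET VOLATILE TABLE VT_Step2 AS
(
    SELECT order_id, SUM(amount) AS total_amount
    FROM VT_Step1
    GROUP BY order_id
)
WITH DATA
ON COMMIT PRESERVE ROWS;
END TRANSACTION;

-- The final SELECT goes in the Input tool's SQL Statement field,
-- outside any BT/ET pair
SELECT * FROM VT_Step2;
```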
This is the error I seem to keep getting. Is BT/ET needed for every create statement and/or insert statement within the query?
Error SQLPrepare: [Teradata][ODBC Teradata Driver][Teradata Database] Data definition not valid unless solitary.
I can't answer that for certain, but I believe it is needed for each CREATE. In Teradata, a DDL statement must be the only (or last) statement in its transaction, which is what "not valid unless solitary" is telling you. As stated above, wrap each volatile table creation in its own BT/ET pair; DML statements like INSERT don't have that restriction, but experiment with wrapping each unique data-definition statement.