Hi,
I am using the In-DB "Data Stream In" tool to bring 1M rows into Teradata, with "Overwrite Table" as the creation mode. I get the following error when I run the workflow:
DataWrap2ODBC::SendBatch:[Teradata][ODBC Teradata Driver]SQL request exceeds maximum allowed length of 1 MB Insert into "databasename"."tablename"("columnname") Values
I tested the write by limiting the rows to 20,000 and I get this error:
DataWrap2ODBC::SendBatch: [Teradata][ODBC Teradata Driver][Teradata Database] Too many data records packed in one USING row. Insert into "databasename"."tablename"("columnname") Values
It looks like I cannot write more than 16,383 rows. If I limit the stream to 16,383 rows it succeeds; if I change it to 16,384 it fails with the error above.
Please advise.
Hi @dineshp
Why don't you use a regular Output tool to create this table? Is there a particular reason for using In-DB?
The first error is due to the length of your SQL statement. When Alteryx converts your data into a SQL statement to create and populate the table in Teradata, that statement can get too long, most likely because the insert is batching too many rows into a single request.
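Roughly, this is the shape of what ends up in a single request when the rows go over as literal inserts (the database/table/column names below are just the placeholders from your error message, and the exact form depends on how the driver batches):

    -- hypothetical batched request: one INSERT per streamed row,
    -- all packed into a single Teradata request
    INSERT INTO "databasename"."tablename" ("columnname") VALUES ('row 1 value')
    ;INSERT INTO "databasename"."tablename" ("columnname") VALUES ('row 2 value')
    ;INSERT INTO "databasename"."tablename" ("columnname") VALUES ('row 3 value')
    -- ...repeated for every row in the batch; the combined request text
    -- is what runs into Teradata's 1 MB limit

When the driver uses a parameterized (USING) request instead, the equivalent ceiling is the number of data records packed into one request, which is what your second error is about.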
Which version of the ODBC driver are you using? And which Teradata database version are you on?
Cheers,
Hi @Thableaus
This is part of a larger workflow. I am trying to bring the data into In-DB for my workflow (that's why I'm using the Data Stream In tool). I am not writing any SQL statement; everything is handled automatically by Alteryx. It works for 2 of my colleagues, so I'm not sure what's wrong. We all have identical drivers and setup.
Teradata and the driver are both version 15.10.
The SQL statement I mentioned is what Alteryx generates when it brings your data into the Teradata table.
Alteryx automatically converts your data into a SQL statement to create and load the table, which is how it "communicates" with the database.
That statement can't get too long: Teradata rejects any single request that exceeds 1 MB.
Cheers,
@Thableaus This workflow works for my colleagues; the insert error only happens to me, so I don't see how it could be exceeding 1 MB.
The issue was resolved after reinstalling SQL Assist 10.50 and increasing the commit transaction size to 16,383.
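For anyone hitting the same wall: after changing the transaction size I just confirm the load with a plain row count on the Teradata side (database/table names are placeholders):

    SELECT COUNT(*) FROM "databasename"."tablename";

The 16,383 cut-off also seems to line up with the cap Teradata puts on data records packed into one request, so keeping the transaction size at or below that number is what matters.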
@dineshp have you streamed into TD as a temp table? If yes, did you have to get special permissions from the DBA?
Would be curious to hear your experience.
I don't think we can write to a TD temp table using the output directly. Maybe write to a physical table first and then load it into a temp table using Post SQL.
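I haven't tried it end to end, but the Post SQL could look something like this, loading the physical table into a volatile table (the temp table name and the database/table names are placeholders):

    CREATE VOLATILE TABLE temp_tablename AS (
        SELECT * FROM "databasename"."tablename"
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;

Just keep in mind a volatile table only lives for the session that created it, so anything that reads it needs to run on the same connection.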