Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

Blob Insert into Oracle DB - "Invalid Buffer Length" when Blob is >1GB

271828
8 - Asteroid

Alteryx won't write Blobs to an Oracle DB (via an OCI connection) when the file size is greater than 1GB; it works fine for anything under 1GB. The file type doesn't seem to matter - I get the error below with both .qvx and .csv files. According to our DBAs, the Oracle DB is configured to allow Blobs of at least 2TB.

Any suggestions on a remedy? I know the Blob tool has the 'break into chunks' feature, which works, but we're trying to avoid having the end user do any ETL if possible.

Error: Output Data (4): DataWrap2OCI::SendBatch: ORA-03146: invalid buffer length for TTC field

Insert into "SCHEMA"."TABLE"("INSERT_DATE","FILE_NAME","FILE_DATA") Values (:A1,:A2,EMPTY_BLOB())
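
For anyone trying to reproduce this outside Alteryx: one common pattern for LOBs this large is the same one the generated SQL hints at - insert EMPTY_BLOB(), get the LOB locator back with a RETURNING clause, and stream the file into it in pieces so no single buffer approaches the 1GB mark. A minimal sketch using the python-oracledb driver (the credentials, DSN, and file name are placeholders; the table and column names are taken from the insert above):

    import oracledb

    # Placeholder connection details - not from this post.
    conn = oracledb.connect(user="me", password="secret", dsn="dbhost/service")
    cur = conn.cursor()

    # Insert an empty LOB and get its locator back for the new row.
    blob_var = cur.var(oracledb.DB_TYPE_BLOB)
    cur.execute(
        'insert into "SCHEMA"."TABLE" ("INSERT_DATE","FILE_NAME","FILE_DATA") '
        'values (sysdate, :name, empty_blob()) returning "FILE_DATA" into :lob',
        name="bigfile.qvx", lob=blob_var)
    lob = blob_var.getvalue()[0]

    # Stream the file into the LOB in 32MB pieces (LOB offsets are 1-based),
    # so no single write comes anywhere near the ~1GB failure point.
    CHUNK = 32 * 1024 * 1024
    offset = 1
    with open("bigfile.qvx", "rb") as f:
        while data := f.read(CHUNK):
            lob.write(data, offset)
            offset += len(data)
    conn.commit()

This is essentially what the Blob tool's 'break into chunks' option does for you, just scripted.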

3 REPLIES
TrevorS
Alteryx Alumni (Retired)

Hello @271828 

I would recommend starting with Oracle, as this appears to be an Oracle-related error message.

https://support.oracle.com/knowledge/Middleware/2571462_1.html

From this other link:
https://stackoverflow.com/questions/47413271/ora-03146-invalid-buffer-length-for-ttc-field
"ORA-03146: Invalid buffer length for TTC field
Cause: A corrupt Two-Task Common (TTC) packet was received.
Action: This is an internal protocol error. Contact Oracle Support Services."

I hope this helps point you in the right direction!
Thanks,
TrevorS

Community Moderator
271828
8 - Asteroid

This is apparently a bug in the Oracle DB. Despite the ORA error, our DBAs thought this was coming from the Alteryx side of things, hence my original post.

We're having to fix this through Oracle support, so it's not a particularly quick resolution.

271828
8 - Asteroid

An update on this, hoping someone has some insight, as this has been ongoing for two months. I made some 'progress' yesterday by changing the connection from OCI to an ODBC connection that I set up as a system DSN.

Alteryx seemingly inserts the blob with no issue, but when I read the table back, the blob column is null. Even though there's no error, it's not writing the blob. The cutoff is still ~1GB: under that I can read the blob fine and output it to a local drive; over that, it's just null. The default size setting is 2.1 billion bytes (which looks like the 32-bit signed-integer limit), and that shouldn't even be a factor since this particular test file is only 1.1GB.
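
One thing worth checking when the write "succeeds" but reads back null: whether the row holds a NULL locator or an EMPTY_BLOB() that never got populated. DBMS_LOB.GETLENGTH distinguishes the two (NULL vs. 0). A short sketch, again assuming the python-oracledb driver, with the same placeholder connection details and the table names from earlier:

    import oracledb

    # Placeholder connection details - not from this post.
    conn = oracledb.connect(user="me", password="secret", dsn="dbhost/service")
    cur = conn.cursor()
    cur.execute(
        'select "FILE_NAME", dbms_lob.getlength("FILE_DATA") '
        'from "SCHEMA"."TABLE"')
    for name, nbytes in cur:
        # nbytes is None for a NULL column, 0 for an unpopulated EMPTY_BLOB()
        print(name, nbytes)

A length of 0 rather than NULL would suggest the locator insert worked and it's the follow-up write of the data itself that the ODBC driver is silently dropping past ~1GB.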

[screenshot: colorado970_0-1617291482273.png]

[screenshot: colorado970_1-1617291532119.png]

This is the workflow I'm using to insert the blob, basically just grabbing a file from a filtered directory.

[screenshot: colorado970_2-1617291665618.png]