Alteryx Designer Discussions


Errors using Output Data to Oracle DB [Designer x64, 11.7]



I'm running Alteryx Designer x64 (version 11.7) on my local machine, and one particular workflow throws an error when writing to Oracle. It is a very simple workflow: an Input Data tool pulls from one Oracle source schema, a Select tool chooses which fields to use, and an Output Data tool writes to another Oracle target schema.


The error is not always exactly the same one, and there are times when the workflow completes without a hitch. If I run it manually ten times in a row, the workflow might complete on two or three of those tries. It can't be the size of the output, because I've had other workflows with much larger output sizes, and they work perfectly.


Below are the errors I've encountered at the Output Data tool when running this same workflow:

  • DataWrap2OCI::SendBatch: ORA-03113: end-of-file on communication channel
  • DataWrap2OCI::SendBatch: ORA-03135: connection lost contact
  • DataWrap2OCI::SendBatch: ORA-12152: TNS:unable to send break message
  • DataWrap2OCI::SendBatch: ORA-12571: TNS:packet writer failure
  • DataWrap2OCI::SendBatch: ORA-12592: TNS:bad packet


Any help or insight would be much appreciated. Thank you for your time. ^_^


If you are writing out to the same database, it would be a good idea to add a Block Until Done tool right before the Output tool.

Sophia Fraticelli
Customer Support Engineer
Alteryx, Inc.

I am experiencing the same issue.


In my case I am reading from a MySQL DB using ODBC, doing some very minimal processing, then outputting the data to an Oracle table using OCI.


Different databases involved in the read and write, but I thought I'd try throwing the "Block Until Done" tool into my workflow as recommended on the thread.  It didn't help my case.  I'm also caching the data in the Input Data tool.


If the size of the data is small (<10K rows), the workflow succeeds regularly.  If the number of rows is increased much beyond that, I start to see the failure anywhere from ~2s to as much as ~8s into the write.


The full data set is a bit over 300K rows.  Failures occur after writing anywhere between 20K to 300K rows.


I should note that I can write the same data using other applications on my local system without any trouble.


Connecting to Oracle 11gR2, InstantClient version 11.2 - x64, Alteryx 11.7.4 x64




I'll provide an update here since I've learned a few more things, though I haven't solved the problem.


I've found a workaround, albeit one that shouldn't be necessary.


First, here is my original workflow that is failing at random points while inserting into Oracle.  (I determined that the Select Tool where I rename fields doesn't play a part by testing the workflow without it and using the same column names in both source and target tables.)



Now the workaround where I first output to a CSV file, then read the CSV file and output to Oracle successfully.
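Outside Alteryx, the same two-step staging idea can be sketched in Python. This is only an illustration of the mechanics (the file name, columns, and sample rows are made up): stage the rows to a CSV file, then read the file back as the source for the Oracle load.

```python
import csv
import os
import tempfile

rows = [("1", "alpha"), ("2", "beta")]  # stand-in for the MySQL result set

# Step 1: stage to CSV (what the first Output Data tool does).
path = os.path.join(tempfile.gettempdir(), "stage.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ID", "NAME"])   # header row
    writer.writerows(rows)

# Step 2: read the CSV back (what the second Input Data tool does);
# `staged` is then what would be written to Oracle.
with open(path, newline="") as f:
    reader = csv.reader(f)
    next(reader)                      # skip the header row
    staged = [tuple(r) for r in reader]
```

Because CSV stores everything as text, the read-back rows may need their types set again before the Oracle load, which is also true of the Alteryx workaround.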



It seems silly that this should be necessary.  I am still considering what this means as far as why I can't go directly from MySQL to Oracle using Alteryx.


There are various references around the web to needing the SQLNET.SEND_TIMEOUT parameter set on the Oracle server. I don't believe this is the cause of the issue, simply because I can read a table from MySQL and write it to Oracle in a variety of other applications on the same system, including my own Python code.
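A loose sketch of what such a Python transfer boils down to, with hypothetical table and column names and the mysql-connector / python-oracledb connection setup omitted: stream the MySQL rows into Oracle in fixed-size batches over plain DB-API 2.0 cursors.

```python
def batched(rows, size):
    """Yield successive lists of at most `size` rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def copy_table(mysql_conn, oracle_conn, batch_size=5000):
    # Hypothetical table/column names; any DB-API 2.0 connections will do.
    src = mysql_conn.cursor()
    src.execute("SELECT id, name FROM source_table")
    dst = oracle_conn.cursor()
    for batch in batched(src, batch_size):
        # executemany keeps round trips low, roughly analogous to
        # the batched inserts behind DataWrap2OCI::SendBatch.
        dst.executemany(
            "INSERT INTO target_table (id, name) VALUES (:1, :2)", batch
        )
    oracle_conn.commit()
```

Tuning `batch_size` down is one way to probe whether the failures are tied to how much data is in flight per round trip.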


While I have been able to work around this particular issue on several occasions, the workaround is never the same. Some solutions that have worked for us include:

  • Using a Data Cleansing tool before output.
  • Changing the data type via the Select tool before output.
  • Deleting the Output Data tool and then adding a new Output Data tool with the same configuration. This suggests that the order in which tools are added to the workflow has an effect.

I've also recently encountered an instance of this issue with a weird workaround:

  • The Output Data tool throws an ORA-XXXXX error if it is set to Delete Data & Append.
  • But the tool works just fine if I set it to Create New Table with a Pre-Create SQL Statement that drops and purges the target table (e.g. drop table TABLE_NAME purge).
  • I don't like this particular workaround, because it doesn't feel right to drop a table and then rebuild it with the very same configuration.


In relation to the recent instance that I mentioned above, I've discovered some new things. Below are some related details:

  • The data leading into this Output Data tool comes to a Block Until Done (BUD) gate.
  • I linked the (1) output of the BUD to this Output Data tool. I linked the (2) output to another tool because the workflow has to do other stuff. My intention is to write the data to a table, and then proceed to use the same data for the rest of the workflow. This table is not used elsewhere in this workflow.
  • If the Output Data tool is set to Delete Data & Append, it throws an ORA-XXXXX error. But it completes successfully if I remove the (2) output, which means the workflow doesn't progress further.
  • If the Output Data tool is set to Create New Table with the Pre-Create SQL Statement to drop the target table, it completes successfully even if the (2) output is connected to the next tool, which means the workflow is completed.
Alteryx Certified Partner

I was experiencing this same issue when trying to insert a large number of records into a table using an OCI connection.  I would get these errors whether overwriting the table, deleting and appending, or creating a totally new table.  I switched the output tool to use the Oracle Bulk Loader ("Oracle Bulk..." under other databases) and this completely resolved the issue.


Unfortunately, even changing the load option to Bulk loading doesn't work for me :-(.


It is weird: what works once doesn't work the next time, and I am still stuck with this issue.