Hi,
I have a very simple workflow and very small dataset that simply reads a CSV with 31 lines (weather data), makes a few small calculations, and then writes to a database table using 'update; insert if new'.
I've included a screenshot of the error below:
I have no joins, no unions, and the dataset is small, so memory issues should not be a problem. The database table was created using this very same workflow; the only difference now is that the writing method was changed from 'create table' to 'update; insert if new' (with the primary key added).
Does anyone have any ideas?
Hi @dandev91,
Thank you for posting on the Community forum. Can you please provide your post below in an email to support@alteryx.com so we can further assist you? Please mention John Posada in the email, so that I may assist you.
I have received this error for the following reasons (separate issues):
1. There's a delimiter inside one of the fields themselves, in which case configure the Input tool to "ignore delimiters in quotes".
2. The table you're writing to would, if the write completed, exceed the amount of space available on the database.
3. There's a string field in the table you're writing to with a length of, say, "X", and one of the incoming values for that field from your Alteryx workflow is longer than "X".
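For case 3, it can help to profile the incoming data before the write. Below is a minimal Python sketch (outside Alteryx, purely illustrative) that scans a CSV for string values longer than the declared column widths; the column names and widths here are hypothetical stand-ins for your target table's schema.

```python
import csv
import io

# Hypothetical column widths taken from the target table's schema (assumption).
column_widths = {"city": 10, "conditions": 15}

def oversized_values(csv_text, widths):
    """Return (line_number, field, value) for fields exceeding their width."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for line_no, row in enumerate(reader, start=2):  # header is line 1
        for field, limit in widths.items():
            value = row.get(field, "")
            if len(value) > limit:
                problems.append((line_no, field, value))
    return problems

sample = (
    "city,conditions\n"
    "Stockholm,Sunny\n"
    "SomeVeryLongCityName,Partly cloudy with rain\n"
)
print(oversized_values(sample, column_widths))
# → [(3, 'city', 'SomeVeryLongCityName'), (3, 'conditions', 'Partly cloudy with rain')]
```

Any rows it reports are candidates for the truncation error; you can then widen the column or shorten the field in the workflow.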
Hi
It may have something to do with the datatypes. I get the same error when reading a time column from the database. When the column is removed from the select, the error disappears. Try changing the datatype with the Select tool and then proceed with your calculations.
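Outside of Alteryx, the same workaround can be sketched in a few lines: convert a time-typed column to a plain string before it reaches the database driver. This is an illustrative assumption about the failure mode (the driver mishandling the native time type); the record layout and field name below are made up.

```python
import datetime

# Hypothetical weather records with a native time-typed column (assumption).
rows = [{"station": "Bromma", "obs_time": datetime.time(6, 30)}]

def stringify_times(records, field):
    """Replace datetime.time values in `field` with ISO-formatted strings."""
    return [
        {**r, field: (r[field].isoformat()
                      if isinstance(r[field], datetime.time) else r[field])}
        for r in records
    ]

print(stringify_times(rows, "obs_time"))
# → [{'station': 'Bromma', 'obs_time': '06:30:00'}]
```

This mirrors what the Select tool does when you change the column's type from Time to String.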
Regards Mats
Stockholm, Sweden
It was number 3 in your list.
Thank you again for your assistance and sorry for the delay in accepting your response as the solution!
I realize this is a very old post, but it saved me today when I received this error with no idea why it started happening in a flow that previously worked. The issue was a field length exceeding the column size in the Oracle table I am writing to. Just wanted to say thanks for the numbered list you provided, as it enabled me to solve this perplexing error.