I am trying to use the INPUT tool to connect to a SQL database. Fields are being truncated to fewer than 256 characters. I have tried:
1.) Using a SELECT tool to change the data type to 'WString' and increase the size to 10,000. This did not work.
2.) Looking in the INPUT tool for an option to increase the character length/limit when reading in the data, but I do not see an option for this.
3.) I am trying to do this with the regular INPUT tool, but I am also wondering whether this can be accomplished with the 'In-Database' INPUT tool as well.
Thank you.
Hi @vujennyfer,
I have worked with large text fields in SQL without problems, but I always read the data with the OLEDB driver.
How long is your text field?
Where are you verifying the length of the loaded field?
Hi @randreag , thanks for the reply.
1.) I'm using the PostgreSQL ODBC Unicode x64 driver. Do you think this is the issue? I have tried the ANSI x64 driver in the past and ran into the error "Input Data (54) Error SQLExecute: ERROR: character with byte sequence 0xe2 0x80 0x8b in encoding "UTF8" has no equivalent in encoding "WIN1252";¶Error while executing the query" (that byte sequence is U+200B, a zero-width space, which has no equivalent character in WIN1252).
2.) The field is not very long. It is a description field with max character length of 1,996.
3.) For verifying the length of the field, I'm looking in SQL and I can see the full field description, but when I use the INPUT and BROWSE tools in Alteryx, the field appears truncated in the BROWSE view.
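One way to pin down where the truncation happens is to measure the stored length on the database side before the data ever reaches Alteryx. A sketch of such a check in PostgreSQL, with placeholder table and column names (`my_table`, `description`):

```sql
-- Hypothetical check: report the longest stored values so they can be
-- compared against what Alteryx's BROWSE tool shows after the INPUT tool.
SELECT id,
       char_length(description) AS stored_length
FROM   my_table
ORDER  BY stored_length DESC
LIMIT  10;
```

If `stored_length` exceeds 255 here but the field arrives shorter in Alteryx, the truncation is happening in the driver or the INPUT tool rather than in the database.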
In the Browse tool, you may get a red arrow saying the value was truncated for display purposes only, but if you actually export it or double-click into it you should see the full text. The important thing is whether you are getting any workflow warnings that indicate data is being truncated.
I am having the same issue, that's why I ended up here.
I think this is due to the configuration of the PostgreSQL ODBC driver.
However, changing these settings did not work for me 😥
But perhaps it can help you further.
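For reference, the psqlODBC driver exposes length-related datasource options with a 255-character default cap on varchar columns. A sketch of a DSN entry in odbc.ini; the server and database names are placeholders, and key names may vary by driver version:

```ini
; Hypothetical DSN entry for the PostgreSQL Unicode x64 ODBC driver.
[PostgresDSN]
Driver             = PostgreSQL Unicode(x64)
Servername         = myserver.example.com
Database           = mydb
; Varchar columns are capped at 255 characters by default; raise the cap:
MaxVarcharSize     = 10000
MaxLongVarcharSize = 100000
; Treat text columns as long varchar so they are not truncated:
TextAsLongVarchar  = 1
```

These map to the "Max Varchar", "Max LongVarChar", and "Text as LongVarChar" options on the driver's Datasource settings dialog.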
Br (ツ)
Thomas
Hi All,
I have the same issue - my columns coming from Azure Databricks are being truncated to 255 characters. Changing this setting from 255 to 10000 did not resolve the issue.
The original size in Databricks is over 2,000 characters, but Alteryx defaults to 255. When I read from a .csv I can read fields longer than 255 characters; it's only ODBC that doesn't work.
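For anyone checking their own setup: the setting in question is the Simba Spark ODBC driver's `DefaultStringColumnLength`, which caps STRING columns at 255 characters by default. A sketch of where it lives in a DSN entry, with placeholder connection details:

```ini
; Hypothetical DSN entry for the Databricks (Simba Spark) ODBC driver.
[DatabricksDSN]
Driver   = Simba Spark ODBC Driver
Host     = adb-1234567890123456.7.azuredatabricks.net
HTTPPath = /sql/1.0/warehouses/abc123
; STRING columns default to 255 characters; raise the limit:
DefaultStringColumnLength = 65535
```

In my case raising this value did not resolve the truncation, but it is the documented knob for this behavior.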
The article Defect GCSE-687: Data Returned by Databricks Query on Long Text Fields Truncates to 254 Characters (... suggests driver 2.6.23. I have a newer driver version, 2.6.29.1049 (64-bit), but the issue persists.
Any ideas?
Thanks