Hi everyone, I am a regular Alteryx user and rely on the community for most of the questions I encounter on a daily basis.
I am facing an issue with importing large string fields using the Input tool and would appreciate your suggestions/advice. Here is a brief explanation:
For a particular family group of items, a list of items is concatenated (comma-separated) and then written to CSV using the Output tool. When validated in that workflow, you can see all the concatenated items, and the data type is V_WString with the maximum length Alteryx supports, i.e. ~2.14 billion characters.
But when the same CSV is imported into another workflow, the Input tool overrides V_WString with a standard V_String of 254 characters, automatically dropping items from the concatenated field.
For example, if family group A has 100 items concatenated into a single V_WString and written to CSV, then on import with the Input tool in another workflow it drops around 80 of the items and changes the data type from V_WString to V_String.
Another problem is that you cannot open this file in Excel to validate it, as Excel will try to make sense of the data and mess up the formatting.
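If it helps while debugging, you can validate the raw CSV outside of Excel with a small script. This is a hedged sketch in Python, not Alteryx; the file name `items.csv` and column name `ItemList` are placeholders for your own file and field:

```python
import csv
import sys

# Python's csv module caps field size (131072 chars by default) and
# raises an error beyond that, so raise the limit for very long fields.
csv.field_size_limit(sys.maxsize)

def max_field_length(path, column):
    """Return the longest value (in characters) found in the given column."""
    longest = 0
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        for row in reader:
            longest = max(longest, len(row[column] or ""))
    return longest

# Example: max_field_length("items.csv", "ItemList")
```

If this reports lengths well above 254 characters, the CSV itself is intact and the truncation is happening on the import side.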
Awaiting your valuable feedback!
It sounds like your data is being truncated on import. In the Input tool configuration, can you try changing the field length to something larger?
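For what it's worth, a 254-character cap would fully explain losing most of a 100-item list; the drop isn't random, it's just whatever fits in the first 254 characters. A hedged illustration in Python (not Alteryx), assuming hypothetical 8-character item codes:

```python
# Simulate 100 comma-separated 8-character item codes (~899 chars total).
items = [f"ITEM{i:04d}" for i in range(100)]
joined = ",".join(items)

# Truncate at 254 characters, as a 254-char V_String field would.
truncated = joined[:254]

# Only the items that fit entirely before the cutoff survive intact.
surviving = [x for x in truncated.split(",") if x in items]
print(len(surviving))  # 28 items survive; 72 are lost to truncation
```

With real item names of varying length the exact count differs, but the pattern (a fixed prefix surviving, the tail lost) matches what you described.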
Awesome, thanks @echuong1!