
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Dynamic Input - "Ignore Delimeters In" - None versus Auto

Anweinbe
7 - Meteor

Hi all,

 

I am inputting quarterly financial statement data from the SEC website. For 3 years of data everything ran fine, but we are now hitting a few errors that say "Too many fields in record #381110" (there are 3 of them). The workflow still completes... but the output is missing a TON of data. Originally my Dynamic Input had "Ignore Delimiters In" set to "None".

 

I updated the "None" to "Auto" and the errors went way, and my output from my flow has a lot more results... which I think fixed it. What I don't understand though is what the difference between these are. By selecting "Auto", am I now introducing something else into my workflow that could cause issues elsewhere? I want to make sure I understand the change I made and why it appears to be working better.

 

Thanks!

1 REPLY
DavidSkaife
14 - Magnetar

Hey @Anweinbe 

 

Imagine you're importing data from a CSV and using a comma as the delimiter to break it into the required separate columns. Now imagine one column contains someone's name, like this:

 

Skaife, David

 

That name would be split into two columns, throwing your data out of alignment and making the record appear to have more fields than expected, which is exactly what the "Too many fields in record" error is telling you.

 

To overcome this, the field is usually 'wrapped', most often in quotes, so the name section looks like this:

 

"Skaife, David"

 

When you set 'Ignore Delimiters In' to None, no delimiters are ever ignored, so the comma inside the quotes still splits the name into two separate columns. Setting it to Auto tells Alteryx to automatically detect the quoting and ignore the comma within the quotes.
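Outside of Alteryx, the same distinction shows up in any CSV parser: a plain split on commas behaves like "None", while a quote-aware parser behaves like "Auto". A minimal sketch in Python (the sample record is made up for illustration):

```python
import csv

# One CSV record where the name field is quoted because it contains a comma.
line = '1001,"Skaife, David",2024-03-31'

# "None"-style parsing: every comma splits, quotes are not respected.
naive_fields = line.split(",")
print(naive_fields)   # ['1001', '"Skaife', ' David"', '2024-03-31']  -> 4 fields, one too many

# "Auto"-style parsing: the csv module detects the quotes and keeps the comma inside them.
quoted_fields = next(csv.reader([line]))
print(quoted_fields)  # ['1001', 'Skaife, David', '2024-03-31']       -> 3 fields, as expected
```

The extra field in the naive parse is the same mechanism behind the "Too many fields in record" error.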

 

At a guess, the SEC download has had some minor changes to its data, which is what's now causing this error. By changing the setting to Auto you're removing the error, and yes, there could be an issue elsewhere, but if the workflow runs fine then you 'should' be ok. It's ALWAYS worth double checking that your output is as you expect, though, especially given a structure change to the data.
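If you want to confirm that embedded delimiters really are the cause, one option is to spot-check the raw download outside Alteryx and count fields per record both ways. A rough sketch, assuming a comma-delimited text file at a made-up path (adjust the path and delimiter to match the actual SEC file):

```python
import csv
from collections import Counter

PATH = "sec_download.txt"   # hypothetical file name
DELIM = ","                 # adjust to the file's real delimiter

naive_counts = Counter()    # field counts when every delimiter splits (like "None")
quoted_counts = Counter()   # field counts when quoted fields are respected (like "Auto")

with open(PATH, newline="", encoding="utf-8") as f:
    for line in f:
        naive_counts[line.rstrip("\n").count(DELIM) + 1] += 1

with open(PATH, newline="", encoding="utf-8") as f:
    for row in csv.reader(f, delimiter=DELIM):
        quoted_counts[len(row)] += 1

# If the naive counts show a spread of record lengths but the quote-aware counts
# collapse to a single value, delimiters inside quoted fields are the culprit.
print("naive:", dict(naive_counts))
print("quote-aware:", dict(quoted_counts))
```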

 

Hope this helps!
