
Alteryx Designer Desktop Discussions


Error: Input Data (1): Error reading "" Too many fields in record #7

Paulomi
8 - Asteroid

Hi,

 

I am trying to read in a .csv file that has about 211 columns and 1,000+ rows. Alteryx throws an error that says "Too many fields in record #7". I tried changing the field size, but that did not help. I do not want to take the approach of reading the file in without a delimiter and then parsing the data; the data is structured in such a way that parsing it and getting the correct field names is difficult that way.

 

Is there a solution to this? 

 

Thanks! 

messi007
15 - Aurora

@Paulomi,

 

Could you share some sample data? It will help us give you a solution.

 

Regards

Luke_C
17 - Castor

Hi @Paulomi 

 

Like @messi007 said, sample data would be helpful. But for some troubleshooting: this error happens because Alteryx expects 211 columns based on the headers. Record #7 has 212+ fields, so the error occurs. Some thoughts:

 

  1. Is there actually extra data in this record? If so, you may need to go back to the source to correct it.
  2. Is the delimiter also contained within a field? For example, if the file is comma-delimited and a field contains 'ABC Company, Inc', that may be interpreted as two fields depending on your settings (see the sketch after this list).

If you cannot figure it out, please share the sample data plus a screenshot of your input configuration.
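
To illustrate that second point outside of Alteryx, here is a minimal plain-Python sketch (the file contents are invented for illustration) of how an unquoted comma inside a value pushes a record over the expected field count, which is essentially what the "too many fields" error is reporting:

    import csv
    import io

    # Two tiny CSVs with a 3-column header; in the second one the comma
    # inside the company name is not protected by quotes.
    quoted = 'id,company,amount\n1,"ABC Company, Inc",500\n'
    unquoted = 'id,company,amount\n1,ABC Company, Inc,500\n'

    for label, text in (("quoted", quoted), ("unquoted", unquoted)):
        rows = list(csv.reader(io.StringIO(text)))
        print(label, [len(r) for r in rows])

    # quoted   [3, 3]  -> field counts match the header
    # unquoted [3, 4]  -> one record has "too many fields"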

Paulomi
8 - Asteroid

Hi @Luke_C 

 

I have put together a sample dataset; this is as close to the real dataset as possible in terms of variations. 

 

Also, 

 

  • Is there actually extra data in this record? If so you may need to go back to the source to correct it. - Manually editing the dataset is not an option, since this is likely to be run against hundreds of different files.
  • I'm reading the file through a Dynamic Input tool.

Here is a screenshot of the configuration - 

 

[screenshot: input configuration]

 

Thanks!

Luke_C
17 - Castor
17 - Castor

Hi @Paulomi 

 

You attached an Excel file, not a CSV. Can you share a CSV exactly as you would receive it? I don't want to make assumptions about how your CSV is formatted. I do see some data with commas, which makes me think that's the issue. Typically, when data contains commas, the fields will be wrapped in quotes (hence the 'ignore delimiters in quotes' setting you have enabled).
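
For context, a plain-Python sketch (not Alteryx) of what a well-formed export normally does: the writer wraps any value that contains the delimiter in quotes, which is exactly what the 'ignore delimiters in quotes' option relies on when the file is read back.

    import csv
    import sys

    # QUOTE_MINIMAL quotes only the values that need it, e.g. those
    # containing the delimiter. Sample values are made up.
    writer = csv.writer(sys.stdout, quoting=csv.QUOTE_MINIMAL)
    writer.writerow(["id", "company", "amount"])
    writer.writerow([1, "ABC Company, Inc", 500])

    # id,company,amount
    # 1,"ABC Company, Inc",500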

Paulomi
8 - Asteroid

Hey @Luke_C My bad, I attached an .xlsx. I'm attaching a .csv with similar content. In my input settings, I am already ignoring delimiters in quotes. What else should I be doing?

Luke_C
17 - Castor
17 - Castor

Hi @Paulomi 

 

I can't reproduce the issue. I did notice in your earlier screenshot that you were starting the import at row 1 instead of where your data actually starts. Maybe that's contributing?

 

[screenshot]

 

 

 

atcodedog05
22 - Nova

Hi @Paulomi 

 

I was able to replicate your issue. There is likely a problem in line 7 of your data. To reproduce the error, I changed Asset Number 4324324 to 432,4324, which introduced an extra comma into line 7.

 

Workflow:

[screenshot: workflow]

 

Check the row mentioned in the error and fix the data issue there. Alternatively, read the file with no delimiter and convert it into a table afterwards.
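
If it helps to pin down the offending record before fixing the source, something like the plain-Python sketch below (outside Alteryx; "input.csv" is just a placeholder for your file) lists every row whose field count differs from the header, which is the same check behind the "too many fields" message:

    import csv

    # Placeholder file name and dialect settings; adjust as needed.
    with open("input.csv", newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=",", quotechar='"')
        header = next(reader)
        for line_no, row in enumerate(reader, start=2):
            if len(row) != len(header):
                print(f"line {line_no}: expected {len(header)} fields, got {len(row)}")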

 

Hope this helps : )

 

Paulomi
8 - Asteroid

I was able to resolve this by reading the file in without a delimiter and then parsing it, but that proved to be a slightly messy solution since there are a lot of dynamic columns. Also, to @Luke_C's point, I need both sets of data: rows 1 to 5 and row 6 onwards.
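
For anyone finding this later, the "no delimiter, then parse" route is roughly equivalent to the plain-Python sketch below (only an outline, not the actual workflow; "input.csv" and the assumption that the main table starts at row 6 are stand-ins for the real layout): each line is read whole and then split with a CSV-aware parser so quoted commas survive.

    import csv

    # Placeholder file name; header position assumed from the sample layout.
    with open("input.csv", encoding="utf-8") as f:
        lines = [line.rstrip("\n") for line in f]

    preamble = lines[:5]  # rows 1 to 5 (the first block of data)
    table = [next(csv.reader([line])) for line in lines[5:] if line.strip()]

    header, data = table[0], table[1:]
    print(header)
    print(data[:2])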
