Error: Input Data (1): Error reading "" Too many fields in record #7
Hi,
I am trying to read in a .csv file that has about 211 columns and 1,000+ rows. Alteryx throws an error that says "Too many fields in record #7". I tried changing the field size, but that did not help. I do not want to take the approach of reading the file in without a delimiter and then parsing the data, because the data is structured in a way that makes parsing and recovering the correct field names difficult.
Is there a solution to this?
Thanks!
Hi @Paulomi
Like @messi007 said, sample data would be helpful. But for some troubleshooting: this error happens because Alteryx expects 211 fields based on the header row, and record #7 has 212 or more, so the error occurs. Some thoughts:
- Is there actually extra data in this record? If so, you may need to go back to the source to correct it.
- Is the delimiter also contained within a field? For example, if the file is comma-delimited and a field contains 'ABC Company, Inc', it may be interpreted as 2 fields depending on your settings (see the sketch after this reply).
If you cannot figure it out, please share the sample data plus a screenshot of your input configuration.
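To illustrate that second point outside of Alteryx, here's a minimal Python sketch (the column names and values are made up) of how one unquoted comma pushes a record past the field count the header defines:

```python
import csv
import io

# Hypothetical 3-column file: the header fixes the expected field count,
# and an unquoted comma inside a value makes one record too wide.
sample = (
    "AssetNumber,Company,Value\n"       # header -> 3 fields expected
    "4324324,ABC Company Inc,100\n"     # 3 fields, parses fine
    "432,4324,ABC Company Inc,100\n"    # stray comma -> 4 fields, too many
)

reader = csv.reader(io.StringIO(sample))
expected = len(next(reader))
for record_no, row in enumerate(reader, start=2):
    if len(row) != expected:
        print(f"Record #{record_no}: {len(row)} fields, expected {expected}")
```

Presumably Alteryx is doing an analogous count against the header row, which is why the error calls out record #7 specifically.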
Hi @Luke_C
I have put together a sample dataset; this is as close to the real dataset as possible in terms of variations.
Also,
- Regarding "Is there actually extra data in this record? If so you may need to go back to the source to correct it." - it is not possible to manually edit the dataset, since the workflow will be run against hundreds of different files.
- I'm reading the file through a Dynamic Input tool.
Here is a screenshot of the configuration -
Thanks!
Hi @Paulomi
You attached an Excel file, not a CSV. Can you share a CSV exactly as you would receive it? I don't want to make assumptions about how your CSV is formatted. I do see some data with commas, which makes me think that's the issue. Typically, when data contains commas, the fields will be contained in quotes (hence the 'ignore delimiters in quotes' setting you have enabled).
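As a quick illustration of why that quoting matters (the values are made up), a standard CSV parser keeps the comma inside a quoted field but splits the same value into two fields when the quotes are missing:

```python
import csv
import io

# Same value, with and without quoting around the comma-containing field.
quoted = '4324324,"ABC Company, Inc",100'
unquoted = '4324324,ABC Company, Inc,100'

print(next(csv.reader(io.StringIO(quoted))))    # ['4324324', 'ABC Company, Inc', '100'] -> 3 fields
print(next(csv.reader(io.StringIO(unquoted))))  # ['4324324', 'ABC Company', ' Inc', '100'] -> 4 fields
```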
Hey @Luke_C, my bad, I attached an .xlsx. I'm attaching a .csv with similar content. In my input settings, I am already ignoring delimiters in quotes. What else should I be doing?
Hi @Paulomi
I can't reproduce the issue. I did notice in your earlier screenshot you were starting the import at row 1 instead of where your data actually starts. Maybe that's contributing?
Hi @Paulomi
I was able to replicate your issue. There is most likely a problem within line 7 itself: when I changed Asset Number 4324324 to 432,4324, the extra comma in line 7 reproduced the error.
Workflow:
Have a look at the row mentioned in the error and fix the data issue there. Another alternative is to read the file with no delimiter and convert it into a table afterwards.
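Since @Paulomi mentioned the workflow has to run over hundreds of files, here's a hedged Python sketch (not an Alteryx tool; the file name and delimiter are placeholders) that reports every record whose field count doesn't match the header, which makes it easier to find the rows that need fixing at the source:

```python
import csv

def find_wide_records(path, delimiter=","):
    """Yield (record number, field count, expected count) for mismatched rows."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        expected = len(next(reader))          # header defines the width
        for record_no, row in enumerate(reader, start=2):
            if len(row) != expected:
                yield record_no, len(row), expected

# Placeholder file name; point this at each of the real .csv files.
for record_no, got, expected in find_wide_records("your_file.csv"):
    print(f"Record #{record_no}: {got} fields (expected {expected})")
```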
Hope this helps : )
I was able to resolve this by reading the file without a delimiter, but that proved to be a slightly messy solution since there are a lot of dynamic columns. Also, to @Luke_C's point, I need both sets of data: rows 1 to 5 and row 6 onwards.
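For anyone landing here later, a rough Python equivalent of that "no delimiter first, table later" workaround might look like this (the file name is a placeholder, and it assumes the five preamble rows sit above the real header):

```python
import csv

# Placeholder path; the real workflow loops over many files.
with open("input.csv", newline="") as f:
    lines = f.read().splitlines()          # every line read as plain text first

preamble = lines[:5]                       # rows 1-5, kept as raw text
table = list(csv.reader(lines[5:]))        # row 6 onwards, split into fields
header, data = table[0], table[1:]

print(preamble)
print(header[:5], "...", len(data), "data rows")
```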
