
Alteryx Designer Desktop Discussions

SOLVED

Too many fields in record#1

nhunter
7 - Meteor

I am trying to read a pipe-delimited file and I am receiving the error above.
The header row is 1,000 characters long and contains 60 fields.

There are a total of 24,173 rows in the file.

 

The error message doesn't make much sense. Is there an upper limit on the number of fields in a delimited data file?

FWIW, Excel has no trouble reading and correctly parsing the data.

28 REPLIES
nhunter
7 - Meteor

It seems that if I append a pipe (delimiter) character to the end of the header row, the file is correctly parsed. However, an empty field is appended to the result set. 

 

[Screenshot attachment: Capture.PNG]

 

This observation doesn't help much though. I still need to be able to read the files as they stand.
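
A minimal illustration (not the poster's actual data) of why a trailing delimiter matters: in most split-based parsers it produces one extra, empty field, which would explain both the original field-count mismatch and the empty column that appears once the header is padded.

# Illustration only: how a trailing delimiter changes the field count from a simple split.
header_without_pipe = "Name|Address|Phone"
header_with_pipe = "Name|Address|Phone|"
data_row = "Mark|Michigan|555-1212|"            # note the trailing pipe

print(len(header_without_pipe.split("|")))      # 3 fields
print(len(header_with_pipe.split("|")))         # 4 fields, the last one empty
print(len(data_row.split("|")))                 # 4 fields -> "too many" against a 3-field header

If the data rows end in a pipe but the header does not, a strict reader sees one more field in record #1 than the header declares; padding the header makes the counts match, but it also creates the empty trailing column described above.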

MarqueeCrew
20 - Arcturus

Hi @nhunter!

 

If it helps, I can explain the behavior, and I'll put in my two cents on Excel as well.

 

If your data looks like:

 

Name|Address|Phone

Mark|Michigan|

Mark|Michigan

Mark|Michigan|555-1212|888-123-4567

 

Excel might parse this out, leaving the missing fields blank and showing NULL values where appropriate.  Excel would also parse this row out incorrectly:

Mark||Michigan|555-1212

It would move Michigan into the phone field because of the extra |.

 

Because the data is expected to conform to a consistent format, Alteryx stops the load when a record with the wrong number of fields is encountered.
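
To make that concrete, here is a minimal sketch (an illustration only, not Alteryx's actual implementation) of a strict reader that rejects a record with too many fields while quietly padding records that have too few:

# Sample rows from the example above.
rows = [
    "Name|Address|Phone",
    "Mark|Michigan|",
    "Mark|Michigan",
    "Mark|Michigan|555-1212|888-123-4567",
]

header = rows[0].split("|")
for record_number, line in enumerate(rows[1:], start=1):
    fields = line.split("|")
    if len(fields) > len(header):
        # Strict behavior: abort the load rather than guess where the extra field belongs.
        raise ValueError(f"Too many fields in record #{record_number}")
    # Forgiving behavior for short records: pad the missing fields with None.
    fields += [None] * (len(header) - len(fields))
    print(dict(zip(header, fields)))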

 

If you post a file with just the first 2 or 3 records, I'd be happy (as would many of the others helping here) to take a look and see what we'd recommend specifically for you.  My first priority is to help you with the issue.  Once that is done, an idea for a future enhancement can be submitted to make Alteryx more forgiving of delimited files and allow the data to be read with caution.

 

Cheers,

Mark

Alteryx ACE & Top Community Contributor

Chaos reigns within. Repent, reflect and restart. Order shall return.
Please subscribe to my YouTube channel.
nhunter
7 - Meteor

Thanks, Mark.

 

Here's a small sample.

 

I had to rename the file to attach it here. The original file name is internalPSHoldings.20161230.

SophiaF
Alteryx

Hi @nhunter,

 

Generally this error occurs because the field length on input is too short, causing truncation of the data. If you increase the Field Length in the Input Data tool and read the file in with no delimiter (delimiter = \0), you should be able to read in the entire file. Then you can use a Text to Columns tool to parse on the pipe.

 

[Screenshot attachment: this_one.png]
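
For readers following along outside Designer, the same idea can be sketched in plain Python (an illustration only; the file path below is a stand-in for the real file): read each line as one wide field with no delimiter applied, then split on the pipe as a second step.

# Step 1: read the file with no delimiter, so each line comes in as a single long field.
with open("internalPSHoldings.20161230.txt", encoding="utf-8") as f:
    single_field_rows = [line.rstrip("\n") for line in f]

# Step 2: the Text to Columns equivalent -- split each wide field on the pipe.
records = [row.split("|") for row in single_field_rows]
print(len(records), len(records[0]))            # row count and field count of the first row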

Sophia Fraticelli
Senior Solutions Architect
Alteryx, Inc.
SophiaF
Alteryx

For additional color, here's a great knowledge base article: Error reading “FILEPATH\MYFILE.txt”: Too many fields in record #

Sophia Fraticelli
Senior Solutions Architect
Alteryx, Inc.
nhunter
7 - Meteor

Hi, Sophia,

 

I appreciate your response but the KB article is not that helpful. Since I have more header fields than data fields, according to the article I should...

 

just get a warning like this:
'Warning: Input Data (1): Record #6: Not enough fields in record'
And the last record in the third field (sic--presumably he means the third field in the last record) will just be null.

But that's not what's happening (I would be fine with it if it were). What actually happens is that I get an error message and no data is loaded at all.

 

I tried loading the file without delimiters and then using the Text to Columns tool, but then all the fields are named Field1, Field2, ... FieldN.

 

How would I recover the field names from the original header row?

RodL
Alteryx Alumni (Retired)

For your last question, use a Dynamic Rename tool with the "Take Field Names from First Row" option.
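
Continuing the earlier sketch, the Dynamic Rename step has a simple plain-Python analogue (illustration only; the file path is again a stand-in): treat the first parsed row as the field names and attach them to every remaining row.

# Promote the first parsed row to field names, recovering the original header.
with open("internalPSHoldings.20161230.txt", encoding="utf-8") as f:
    records = [line.rstrip("\n").split("|") for line in f]

field_names, data_rows = records[0], records[1:]          # first row supplies the names
named_rows = [dict(zip(field_names, row)) for row in data_rows]
print(field_names[:5])                                    # a peek at the recovered field names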

MarqueeCrew
20 - Arcturus

Hi @nhunter,

 

Helpful as my friends @RodL & @SophiaF were, I have a full answer for you.

 

YUCK!

 

In order to bring in your data cleanly, I did the following (a rough sketch of the same steps follows below):

  1. Counted the pipes to determine that there are 60 fields of data.
  2. Imported the file as a single field:
      a. Set the field length to 1024 bytes.
      b. Used \0 as the delimiter.
  3. Trimmed the spaces off of the data (I think this is the root issue).
  4. Parsed the data on the pipe (|) delimiter.
  5. Removed the extra fields.
  6. Wrote this essay....

Excel is simply more forgiving of stray spaces...
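
For anyone reading along without Designer open, the steps above have a rough plain-Python equivalent (a sketch only, assuming trailing spaces and stray pipes are the culprits; the file path is a stand-in for the real file):

EXPECTED_FIELDS = 60                                      # step 1: counted from the header row

# Step 2: import as a single field (no delimiter), one long value per line.
with open("internalPSHoldings.20161230.txt", encoding="utf-8") as f:
    raw_lines = [line.rstrip("\n") for line in f]

trimmed = [line.strip() for line in raw_lines]            # step 3: trim stray spaces
parsed = [line.split("|") for line in trimmed]            # step 4: parse on the pipe delimiter
clipped = [row[:EXPECTED_FIELDS] for row in parsed]       # step 5: drop any extra trailing fields

# Dynamic Rename equivalent: take the field names from the first row.
field_names, data_rows = clipped[0], clipped[1:]
print(field_names[:5], len(data_rows))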

 

I am sending you the workflow in version 11.  I recommend upgrading to it if you haven't already.  I am sending it in version 10 too.

 

Cheers,

Mark

Alteryx ACE & Top Community Contributor

Chaos reigns within. Repent, reflect and restart. Order shall return.
Please subscribe to my YouTube channel.
nhunter
7 - Meteor

Thanks, Mark, much appreciated!
