
SOLVED

File Length Exceeds Length Allowed By Manual Editor

EvolveKev
8 - Asteroid

Hi there!

 

I'm having trouble importing data. The file is a .txt file, about 2.5 GB in size.

 

The data is fixed-width, i.e. each row of data has only one column, and I need to use the manual editor to split the data into many, many columns (about 450!!!).

 

I'm running into the following error:

 

The file length exceeds length allowed by manual editor.

For best results browse to the description of the file (.flat) and specify the data file (.asc) in Option #5 of Input configuration.

 

I'm not quite sure how to get Alteryx to read the data. I've tried importing it as a flat ASCII file and as a text file, but I get the message above. I'm also not really sure how to change Option #5 of the input configuration as described (I'm pretty new to all this).

 

I've attached the data map so you can see what kind of data I'm dealing with. On this spreadsheet, I'm looking at the section titled Appraisal_Info.txt.

 

I did see a similar post, which mentioned reading the file as a .csv, changing the field lengths, and then using the Text To Columns tool to parse the data. That might work, as I can read the data and change the field length...but I can't use Text To Columns since the data contains no delimiters...
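For context, fixed-width data has no separator characters at all, so Text To Columns has nothing to split on; each field is identified purely by its character position. A minimal sketch of position-based slicing, outside Alteryx, assuming a hypothetical layout (the real field names and offsets would come from the data map):

# The layout below is hypothetical; real offsets come from the data map.
layout = [
    ("account_id", 0, 10),       # (field name, start position, length)
    ("owner_name", 10, 40),
    ("appraised_value", 50, 12),
]

def parse_fixed_width(line, layout):
    """Slice one fixed-width record into named fields by position."""
    return {name: line[start:start + length].strip()
            for name, start, length in layout}

# Example: a single 62-character record with no delimiters anywhere.
row = "0000012345JOHN Q PUBLIC" + " " * 27 + "000000125000"
print(parse_fixed_width(row, layout))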

 

Anyone have any suggestions?

4 REPLIES
BenMoss
ACE Emeritus

I would consider approaching this in a different way, which doesn't use manual mode, and involves a lookup table of sorts!

 

I've attached an example solution of how this could work.

 

Basically, you create a table which specifies each field and its length, in order (starting with the first). Given the size of your data this may not be ideal, but it's worth a try!!
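Ben's attached workflow shows the Alteryx version; as a language-agnostic sketch of the same idea, assuming a hypothetical (field, length) lookup table, each field's start position is simply the running total of the lengths before it:

import itertools

# Hypothetical lookup table: (field name, length) in file order.
field_lengths = [("record_type", 2), ("parcel_id", 12), ("land_value", 10)]

# Each field's start offset is the running total of the lengths before it.
starts = [0] + list(itertools.accumulate(length for _, length in field_lengths))

def split_record(line):
    return {name: line[start:start + length].strip()
            for (name, length), start in zip(field_lengths, starts)}

print(split_record("01ABC123456789   1250000"))
# -> {'record_type': '01', 'parcel_id': 'ABC123456789', 'land_value': '1250000'}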

 

Ben

EvolveKev
8 - Asteroid

Hey Ben!

 

I totally appreciate the reply, sir.  This looks very interesting, and I'll see what I can do with it.  We might be "lucky" in a way as we don't require all fields - we actually only need 20 or so of the 450. So, with a bit of calculating, our lookup table could be pretty small!

 

Let me get my head around this, Ben - I'll keep you posted!

EvolveKev
8 - Asteroid

Ben, I just tested this with my data - it works like a charm. For the most part I can figure out what's going on in your workflow, but I'm going to do a little googling for some of those functions.

 

This is brilliant!

EvolveKev
8 - Asteroid

Ooooof. 

 

So, we've hit a bit of a bump. Your solution, @BenMoss, works extremely well, but the issue we've run into is the ridiculous size of the file.

 

Our data consists of 300,000 records, and each record is a single field about 9,000 characters long. By tokenizing on single characters, the number of rows grows to an insane amount, which slows the system to a crawl, i.e. we can't get this sucker to finish running!

 

But, the lookup table works perfectly, and I can't see any other way to work with this data - any other brainwaves?
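For scale: tokenizing 300,000 records of ~9,000 characters each materializes roughly 300,000 × 9,000 = 2.7 billion single-character rows, which is why the run never finishes. One possible alternative, sketched outside Alteryx with hypothetical offsets, is to skip tokenization entirely and slice only the ~20 needed fields from each record, streaming one row at a time:

import csv

# Hypothetical offsets for the handful of fields actually needed,
# precomputed from the data map: name -> (start, length).
needed = {"parcel_id": (2, 12), "land_value": (14, 10)}

with open("Appraisal_Info.txt", encoding="ascii") as src, \
     open("appraisal_subset.csv", "w", newline="") as dst:
    writer = csv.DictWriter(dst, fieldnames=list(needed))
    writer.writeheader()
    for line in src:  # stream one 9,000-character record at a time
        writer.writerow({name: line[start:start + length].strip()
                         for name, (start, length) in needed.items()})

Because each line is sliced only ~20 times instead of being exploded into 9,000 rows, the output stays at 300,000 rows regardless of record width.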
