Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Can't see the option to "Start Data Import on Line #" in an input data

kimbria7
8 - Asteroid

New to Alteryx, so bear with me here

 

I added an Input Data tool to my workflow to read in an .xls file, and I want to skip the first and third rows, with the header in the second row. To skip the first row, I was looking for the option/setting within Input Data for "Start Data Import on Line #", but I can't see it. This is what I see:

[screenshot: alteryx.PNG — Input Data configuration pane]

I was going off this forum post since it's relatively new. Where can I find the setting?

If it matters, I'm running Alteryx Non-Admin Designer Version 2018.4.5.55178

Claje
14 - Magnetar

Hi,

 

This functionality exists for Microsoft Excel (.xlsx) files, but not for Excel 97-2003 (.xls) files.

If you need to read the file in as a 97-2003 .xls file, here is a method you can use to get the same output:

1) Check off "First Row Contains Data"

2) Add a "Select Records" tool, and configure it with the following ranges:

2
4+

This keeps row 2 and rows 4 onward, filtering out rows 1 and 3.

3) Use a Dynamic Rename tool, and choose the Rename mode "Take Field Names from First Row of Data".  This will give you your header information (previously on line 2) as your new column headers.
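For anyone who wants to see the logic outside of Designer, here is a minimal sketch of the same three steps in pandas. The sample data and column names are made up to stand in for the .xls contents (row 1 junk, row 2 headers, row 3 junk, rows 4+ data); it is an illustration, not what Alteryx does internally.

```python
import pandas as pd

# Sample rows standing in for the spreadsheet contents (hypothetical).
raw = pd.DataFrame([
    ["report generated 2018", None],   # row 1: junk
    ["Name", "Amount"],                # row 2: the real headers
    ["(internal use only)", None],     # row 3: junk
    ["Alice", 10],                     # rows 4+: data
    ["Bob", 20],
])

# 1) "First Row Contains Data": every row is read as data (header=None above).
# 2) Select Records with ranges 2 and 4+ (1-based) -> 0-based indices 1, 3, 4, ...
kept = raw.iloc[[1] + list(range(3, len(raw)))]

# 3) Dynamic Rename, "Take Field Names from First Row of Data":
#    promote the first remaining row to column headers, then drop it.
kept.columns = kept.iloc[0]
result = kept.iloc[1:].reset_index(drop=True)

print(result)
```

Reading the real file would just replace the `raw = pd.DataFrame(...)` block with `pd.read_excel("yourfile.xls", header=None)`.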

 

Hope this helps!

kimbria7
8 - Asteroid

Perfect, that works! It doesn't seem as if Alteryx knows when to stop reading the file: I have just 20 records, so I had to make the range 2-20 (after changing the file to .xlsx). Alteryx seems to think there are a lot more rows, so the output of a conditional IF statement goes on forever. Do I have to specify the row number where Alteryx should stop reading?

Claje
14 - Magnetar

When Alteryx reads in more rows from an Excel file than you expect, it usually means some unusual Excel formatting is making it look like there is data further down the spreadsheet.

 

Your method will definitely work if you know there will never be more than 20 records in the file. If there could be more, I would recommend using a Filter tool on a field that you know will always have data in it, with a condition like:

!IsNull([Your Field])

That way you can handle a dynamic number of records while still only keeping valid data.
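The same null-based filter can be sketched in pandas; the column name here is a hypothetical stand-in for whichever field is always populated in your data:

```python
import pandas as pd

# Trailing phantom rows show up as all-null records.
df = pd.DataFrame({
    "Name":   ["Alice", "Bob", None, None],
    "Amount": [10, 20, None, None],
})

# Analogue of the Alteryx filter expression !IsNull([Your Field]):
# keep only rows where the always-populated field actually has a value.
valid = df[df["Name"].notna()]

print(valid)
```

Because the condition keys off the data rather than a row count, it still works when the file grows past 20 records.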
