
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Select records at frequent intervals

JoRao
8 - Asteroid

Hello - I am trying to convert an Excel file into a database file, and the data looks like the left image below. I have set up a workflow to select the records under each of the locations (Camp Springs, Largo, etc.) and then manipulate the data to make it look the way I want it to. The way I am doing it right now is shown in the image below. Is there a way for me to avoid repeating the Select Records process and instead use a tool (sort of like a macro) that can select records at specified intervals? Thanks a lot in advance.

 

Best,

Jo

 

Sample Data.PNG

 

 Current Workflow.PNG

 

 

11 REPLIES
BenMoss
ACE Emeritus

Hi!

 

You've posted an image of your sample data, but if you could attach it as an xlsx file, that would make it much easier to help you! It only needs to be a small sample, as in the image.

 

Yes, there is definitely a way to make this dynamic. Once we have the data, I'm sure someone will get an answer to you quickly!

 

Ben

DavidP
17 - Castor
A very basic way of doing it is to use the Sample tool. If your records repeat every 19 rows, you can choose "1 of every N rows" and set N to 19, so rows 1, 20, 39, etc. will be selected. To select rows 2, 21, 40, etc., you use 2 Sample tools: the first is set to skip the first 1 row and the 2nd is set to 1 of every 19.

As the pattern is repeating, it lends itself well to a macro.
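For reference, the same Sample-tool logic looks roughly like this in pandas (a sketch only, not the Alteryx tools themselves; the DataFrame and column name are illustrative stand-ins for the Excel file):

import pandas as pd

# Illustrative stand-in for the Excel data; in practice it would come from
# something like pd.read_excel("Sample Data.xlsx")
df = pd.DataFrame({"Field1": [f"row {i}" for i in range(1, 58)]})

n = 19  # length of each repeating block

# "1 of every N rows" with N = 19 -> rows 1, 20, 39, ... (0-based 0, 19, 38)
first_rows = df.iloc[0::n]

# "Skip 1st N rows" with N = 1, then "1 of every 19" -> rows 2, 21, 40, ...
second_rows = df.iloc[1::n]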
JoRao
8 - Asteroid

Sure @BenMoss. Find attached the sample file.

 

Thanks!

 

JoRao
8 - Asteroid

Thanks @DavidP. Would it work if I had to select records 4-21 and then repeat the Select Records process for 22-39, and so on?

BenMoss
ACE Emeritus

A quick example with the following data would be...

 

Field1
Camp Springs
HMO
POS
Capitol Hill
HMO
POS

 

Use the Record ID tool to give the following table...

 

Field1         RecordID
Camp Springs   1
HMO            2
POS            3
Capitol Hill   4
HMO            5
POS            6
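
In pandas terms, the Record ID step is just a 1-based running row number; a minimal sketch using the six sample rows above:

import pandas as pd

# The six sample rows from the table above
df = pd.DataFrame({"Field1": ["Camp Springs", "HMO", "POS",
                              "Capitol Hill", "HMO", "POS"]})

# Equivalent of the Record ID tool: a 1-based running row number
df["RecordID"] = range(1, len(df) + 1)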

 

Use a Formula tool to divide the RecordID by the number of rows in each group (in this example, 3), then use the CEIL function to round this up to a whole number...

 

 

CEIL([RecordID]/3)

Let's call this new field Group.

 

 

Field1         RecordID   Group
Camp Springs   1          1
HMO            2          1
POS            3          1
Capitol Hill   4          2
HMO            5          2
POS            6          2
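
The same grouping formula, sketched in pandas (the block length of 3 and the toy data are taken from the example above):

import math
import pandas as pd

# Sample rows with RecordID already added, as in the sketch above
df = pd.DataFrame({
    "Field1": ["Camp Springs", "HMO", "POS", "Capitol Hill", "HMO", "POS"],
    "RecordID": [1, 2, 3, 4, 5, 6],
})

# Group = CEIL(RecordID / 3): rows 1-3 fall in group 1, rows 4-6 in group 2
df["Group"] = [math.ceil(rid / 3) for rid in df["RecordID"]]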

 

I will then use a Sample tool to return the first record per group. Then, as a separate stream, I use another Sample tool to return everything but the first record per group.

 

Then simply use the join tool on the group field to make your data look something like...

 

Company        Field1
Camp Springs   HMO
Camp Springs   POS
Capitol Hill   HMO
Capitol Hill   POS
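
Sketched in pandas, the two Sample tools plus the Join work out roughly as below (toy data with Group already computed; this illustrates the logic, not the attached workflow itself):

import pandas as pd

# Toy data with the Group column already computed, as in the sketches above
df = pd.DataFrame({
    "Field1": ["Camp Springs", "HMO", "POS", "Capitol Hill", "HMO", "POS"],
    "Group":  [1, 1, 1, 2, 2, 2],
})

# First record per group -> the location name (first Sample tool)
firsts = (df.groupby("Group", as_index=False)
            .first()
            .rename(columns={"Field1": "Company"}))

# Everything but the first record per group (second Sample tool)
rest = df[df.groupby("Group").cumcount() > 0]

# Join on the Group field, like the Join tool configured on Group
result = firsts.merge(rest, on="Group")[["Company", "Field1"]]
print(result)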

 

Example workflow attached...

 

Ben

 

JoRao
8 - Asteroid

Thanks @BenMoss. Since my company name is in one column and the product is in a different column, would I have to combine the two first? This is how my entire raw data looks: Rawdata.PNG

And I have attached the format of my sample output file.

 

Can I use your solution without combining the two columns? Sorry if my questions sound very ignorant!

 

Thanks!

 

 

BenMoss
ACE Emeritus

Ah, this is great! Sorry, I made an assumption about your output data and what the output should look like; now that I have both, I can help for sure!

 

This should work perfectly with your expanded dataset too; you just need to add the Transpose after the filter.
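
In pandas terms, a Transpose-like step is roughly a melt; the column names and values below are purely hypothetical, since the real raw data layout is only visible in the attached image:

import pandas as pd

# Hypothetical stand-in for the wider raw data (columns and values are illustrative only)
df = pd.DataFrame({
    "Company": ["Camp Springs", "Capitol Hill"],
    "HMO": [10, 20],
    "POS": [15, 25],
})

# An Alteryx Transpose with Company as the key column is roughly a melt:
# the remaining columns become Name/Value pairs per Company
long_form = df.melt(id_vars="Company", var_name="Name", value_name="Value")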

 

Example workflow attached.

 

Ben

JoRao
8 - Asteroid

Can you send me your input file as well? Thank you so much!!
