
Alteryx Designer Desktop Discussions

SOLVED

Duplicate data in multiple columns - how to resolve

binsell
8 - Asteroid

Hi All - I'm hoping someone can help me solve my issue. I want to identify the rows in my massive data set where 6 specific columns all contain the same values. I have attached a before and after so you can hopefully see what I want to return. After identifying which lines are duplicates, I then need to remove the duplicate line (or lines) from my data while obviously keeping the lines that are not duplicates. What tool can I use to remove them? If I just filtered on 'Yes' it would still bring back all of the duplicate lines. Any help would be much appreciated :)
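For anyone who thinks in code, the flagging step might look something like the pandas sketch below (pandas rather than Alteryx, purely as an illustration). The six key column names, the extra "other" column, and the sample values are all made up; the real data set would use its own fields.

```python
import pandas as pd

# Made-up names standing in for the 6 key columns in the real data set.
KEY_COLS = ["col1", "col2", "col3", "col4", "col5", "col6"]

# Tiny made-up sample: the first two rows share the same 6 key values.
df = pd.DataFrame({
    "col1": [1, 1, 2],
    "col2": ["a", "a", "b"],
    "col3": [10, 10, 20],
    "col4": ["x", "x", "y"],
    "col5": [True, True, False],
    "col6": [0.5, 0.5, 0.7],
    "other": ["keep", "dup", "unique"],  # a column outside the 6 keys
})

# keep=False flags every row whose 6 key values appear more than once,
# mirroring the Yes/No column in the attached before-and-after example.
df["Duplicate?"] = df.duplicated(subset=KEY_COLS, keep=False).map({True: "Yes", False: "No"})
print(df)
```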

14 REPLIES
binsell
8 - Asteroid

Amazing as always! I think you have replied to every single one of my posts! You are the master! Thank you :)

binsell
8 - Asteroid

@Raj Thank you for this. Once we have identified which lines are duplicates, how do I then get rid of the duplicate lines and keep just one line in my data? The formula lets me identify which ones are duplicates, but I then want to filter the extras out and keep just one of the lines, if that makes sense! I tried the Unique tool, but it didn't seem to work and I lost some data. Thank you so much
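As a rough illustration only, keeping one line per duplicate group looks like the pandas sketch below; the key column names and the data.csv input are placeholders. In Alteryx terms this is roughly what the Unique tool does when only the 6 key columns are ticked: the first occurrence goes to the U output and the repeats go to the D output, so "losing" rows on the D side is expected behaviour rather than lost data.

```python
import pandas as pd

KEY_COLS = ["col1", "col2", "col3", "col4", "col5", "col6"]  # placeholder names
df = pd.read_csv("data.csv")                                 # placeholder input

# Keep the first row of each group of repeated 6-key combinations and drop the
# rest; rows whose 6 key values are unique pass through untouched.
deduped = df.drop_duplicates(subset=KEY_COLS, keep="first")
```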

binsell
8 - Asteroid

Thank you @binuacs as always! So once I have identified which line is the duplicate, how do I get rid of the duplicate line only, keeping just one line of the data? Thank you

Raj
16 - Nebula

@binsell the output of the Summarize tool in my solution contains only the unique values.

binsell
8 - Asteroid

The Summarize tool does, but I then have to join the data back in to get all my other columns. Once I do this, I can see all the duplicate rows again. So from there, I still need to delete the ones that are duplicates. Hope that makes sense? Thank you
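To make that join-back behaviour concrete, here is the same pattern sketched in pandas (again just an illustration; the column names and data.csv input are placeholders): the Summarize step keeps only the unique key combinations, the join restores the other columns but also restores every duplicate row, so a row-level dedup is still needed at the end.

```python
import pandas as pd

KEY_COLS = ["col1", "col2", "col3", "col4", "col5", "col6"]  # placeholder names
df = pd.read_csv("data.csv")                                 # placeholder input

# Summarize-tool analogue: only the unique combinations of the 6 key columns.
unique_keys = df[KEY_COLS].drop_duplicates()

# Join-tool analogue: bring the other columns back in. Every row whose keys match
# comes through, so the duplicate rows reappear here, as described above.
joined = unique_keys.merge(df, on=KEY_COLS, how="left")

# The row-level dedup therefore still has to happen, e.g. keep the first row per key.
one_per_key = joined.drop_duplicates(subset=KEY_COLS, keep="first")
```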
