Alteryx Designer Desktop Discussions

SOLVED

How to find the largest value of a field and give that as a new field value

haya
8 - Asteroid

Hi, I am a newbie to Alteryx. I want to find the largest value of a column and provide that value as a new column. For example, I have a column Week No with values from 1 to 13, including blanks. I want to generate a column Max Week No and populate it with the value 13 across all rows. How can I achieve that? Note: my data has 31 columns and 841,808 rows; I just want to take the largest value of the Week No column and use it as the Max Week No value for every row. Regards, Haya

4 REPLIES
jdunkerley79
ACE Emeritus

You will need to do this in two steps (sketched in code below):

- Use a Summarize tool to get the max week number
- Use an Append Fields tool to add that column to all rows
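
For anyone who prefers to see the logic in code, here is a minimal pandas sketch of the same two steps (the DataFrame and the `week_no` column name are illustrative, not from the attached sample):

```python
import pandas as pd

# Illustrative data: week numbers 1-13 with some blanks (NaN)
df = pd.DataFrame({"week_no": [1, 5, None, 13, 7, None, 2]})

# Step 1 (Summarize): find the max week number; blanks are ignored
max_week = df["week_no"].max()

# Step 2 (Append Fields): broadcast the single value to every row
df["max_week_no"] = max_week
```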

[screenshot: workflow with a Summarize tool feeding an Append Fields tool]

 

Sample attached

 

haya
8 - Asteroid

Thanks a lot. It worked :)

Storm
9 - Comet

Note that depending on your file size and needs, you might choose an option other than the default for the Append Fields tool, which throws an error if the appended source has more than 16 records:

 

[screenshot: Append Fields configuration, with options to error, warn, or allow on appends of more than 16 records]

I choose "Allow All Appends" as situations warrant.
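
For context on why that guardrail exists: Append Fields is effectively a Cartesian join, so a source with N records multiplies the target's row count by N. A small pandas illustration with made-up data:

```python
import pandas as pd

target = pd.DataFrame({"id": range(3)})      # 3 rows
source = pd.DataFrame({"flag": ["a", "b"]})  # 2 rows

# Cross join, i.e. what "Allow All Appends" permits at any size
out = target.merge(source, how="cross")

print(len(out))  # 3 * 2 = 6 rows
```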

Wiggot01
5 - Atom

A Summarize feeding an Append can be a terrible option, depending on the size of your data.

 

I just tried that method on half a million rows of incoming data and would have produced 8 million rows of output (half a million rows times 16 copies), but only because I hadn't noticed the 16-record append limit. If I had seen that option from the beginning and set it to allow all appends, I would have ended up with 250 billion rows of output (half a million times half a million).

 

In my Summarize, I group by VIN and take the max value of Received_Date. Then, instead of using an Append tool, I switched to a Join on both VIN and Received_Date = Max_Received_Date. If you have a unique key field, set the workflow up this way (see the sketch below).
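
In code, that pattern looks roughly like this pandas sketch (the `VIN` and `Received_Date` names come from the description above; the data is made up):

```python
import pandas as pd

# Made-up incoming data: multiple records per VIN
df = pd.DataFrame({
    "VIN": ["A", "A", "B", "C", "C"],
    "Received_Date": pd.to_datetime([
        "2023-01-01", "2023-02-01", "2023-01-15",
        "2023-01-10", "2023-01-05",
    ]),
})

# Summarize: group by VIN, take the max Received_Date
latest = df.groupby("VIN", as_index=False)["Received_Date"].max()

# Join on both VIN and Received_Date == Max_Received_Date,
# keeping only the most recent record per VIN
result = df.merge(latest, on=["VIN", "Received_Date"])
```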

 

I started with 531,942 rows and ended up with 527,690 unique rows. The 4,252 dropped rows were duplicates where newer data for that VIN had come through. Since my data pool grows every week, I had to do this so that I'm only using the most recent third-party data to keep my system of record up to date.
