My solution is like many others, with the exception of pre-summarizing the whale data by whale ID and year. My general rule is always to reduce rows and columns as early as possible in a workflow, and that definitely made a difference with a dataset of this size.
Avg run time without pre-summarizing: 3.7 s
Avg run time with pre-summarizing: 2.7 s
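For anyone working outside Alteryx, the same "summarize early" idea looks roughly like this in pandas. This is just an illustrative sketch, not the actual challenge workflow, and the column names (`whale_id`, `year`, `distance_km`) are made up for the example:

```python
import pandas as pd

# Hypothetical whale-sighting data; the real dataset's schema will differ.
df = pd.DataFrame({
    "whale_id":    [1, 1, 2, 2, 2],
    "year":        [2020, 2020, 2020, 2021, 2021],
    "distance_km": [3.0, 5.0, 2.0, 4.0, 6.0],
})

# Pre-summarize as early as possible: collapse to one row per whale
# per year before any joins or heavier downstream steps run.
summary = (
    df.groupby(["whale_id", "year"], as_index=False)["distance_km"]
      .sum()
)
print(summary)
```

Every downstream step then operates on the small aggregated table instead of the full row-level data, which is where the run-time savings come from.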
I was also intrigued by @PhilipMannering and others' solutions using Find/Replace. I never would have thought to use that tool for this! On my laptop, it was a good bit slower, though (avg 5.9 s).