Engine Works

Under the hood of Alteryx: tips, tricks and how-tos.
MichaelSu
Alteryx Alumni (Retired)

In this article, we’re going to explore how we can leverage Alteryx to run analytics on fishing reports in Southern California. You’ll learn how to use web scraping to pull in data from 976-Tuna, macros to automate the cleansing process, spatial analytics to better understand where the fish are biting, reporting tools to visualize our analysis, and Alteryx Server to automate all of it on a weekly basis.

 

(Insert Bodhi & Jonny Utah because it’s the Alteryx Community & THAT’S WHAT WE DO!)

If you’re an avid fisherman in the SoCal area, it’s likely you’ve been on 976-Tuna checking out the fish counts, reports, landings, and all the other necessary information before sinking some money into your next big offshore trip. OR, if you enjoy being subjected to witty banter and inevitable dad jokes for not catching anything, that’s fine too…

 

‘Well that’s why they call it fishing and not catching.’

For those of us who don’t want to get skunked, here’s how the process works. 976-Tuna updates its fishing reports daily and keeps all historic data and fish counts. With that in mind, the first step is to set a start date that determines how many days of data to read in. To handle multiple days’ worth of data, the macro is built as a batch macro. In the example attached, it’s set to dynamically read in yesterday’s counts by updating the day / month / year in the URL below (note that if you read in multiple days of data, you open the door to forecasting and predictive analysis). Feeding a date into the macro adjusts the URL of the webpage we’re scraping and passes it to the Download tool inside the macro.

 

https://www.976-tuna.com/counts?m=8&d=27&y=2019
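
If you ever wanted to mimic the date-driven URL step outside of Alteryx (say, in the Python tool), a minimal sketch might look like this. It assumes the requests library is available; the URL parameters come straight from the example above, and the variable names are my own:

# A minimal sketch of the date-driven download step, assuming the
# requests library is installed. The URL format matches the example above.
from datetime import date, timedelta

import requests

def counts_url(day: date) -> str:
    """Build the 976-tuna.com counts URL for a given date."""
    return f"https://www.976-tuna.com/counts?m={day.month}&d={day.day}&y={day.year}"

yesterday = date.today() - timedelta(days=1)
response = requests.get(counts_url(yesterday), timeout=30)
response.raise_for_status()
html = response.text  # raw HTML, cleaned up in the next step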

 


 

Once we’ve extracted the data, it’s time to clean it up and make sense of the HTML. Thankfully, our friends on the Community have already built an HTML parse workflow that you can paste straight into your own workflow to get the data into a nice, clean format.
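
For those scripting the parse instead, here’s a rough equivalent using BeautifulSoup and pandas. This is a sketch only; it pulls every table row generically, since the exact layout of the counts page can change, and it assumes html is the page source from the download step:

# A rough sketch of the HTML parsing step, assuming the BeautifulSoup
# and pandas libraries; 'html' is the page source downloaded above.
import pandas as pd
from bs4 import BeautifulSoup

soup = BeautifulSoup(html, "html.parser")
rows = []
for tr in soup.find_all("tr"):
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:  # skip header-only rows
        rows.append(cells)

counts = pd.DataFrame(rows)  # column meanings depend on the page layout
print(counts.head())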

 


 

From there, it’s a matter of visualizing our results. Thanks to the Alteryx Street Geocoder, we can take the ~20 different landings and geocode them so they can be plotted on a map. But wait, there’s more… we also use thematic mapping to show which areas are bringing in a lot of fish and which are not (hot / cold).
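
Under the hood, the hot/cold view is just an aggregation of counts per landing. Here’s a toy Python version of that roll-up; the column names and numbers are made-up placeholders, not real counts:

# A minimal sketch of the hot/cold roll-up behind the thematic map.
# Column names and sample data are hypothetical stand-ins for the
# parsed fish counts.
import pandas as pd

counts = pd.DataFrame({
    "landing": ["H&M Landing", "Davey's Locker", "H&M Landing"],
    "species": ["Yellowtail", "Bluefin Tuna", "Dorado"],
    "fish_count": [42, 17, 8],
})

by_landing = (
    counts.groupby("landing", as_index=False)["fish_count"]
    .sum()
    .sort_values("fish_count", ascending=False)
)
print(by_landing)  # higher totals = the "hot" landings on the map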

 

Do I head down to H&M Landing to fish Mexico Waters or hang tight at Davey’s Locker? Hmmm...

Finally, the report is automated so that every Wednesday we get an email telling us where they’re catching, how many, and whether they’re bringing in the good stuff - yellowtail, bluefin, yellowfin, dorado, etc. This, in turn, allows us to plan accordingly, prep for the sushi party, and decide whether or not we’re calling in sick to work on Friday! Just kidding, I would never do that. EVER.
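
Alteryx Server handles the scheduling and delivery for us, but for reference, the email itself is simple to mock up with Python’s standard library. A hedged sketch, where the addresses, SMTP host, and summary text are all placeholders:

# A toy sketch of the weekly summary email, assuming a reachable SMTP
# relay; every address and string here is a hypothetical placeholder.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Weekly SoCal Fish Counts"
msg["From"] = "reports@example.com"
msg["To"] = "me@example.com"
msg.set_content("Hot landing this week: H&M Landing (42 yellowtail).")

with smtplib.SMTP("localhost") as smtp:  # swap in your SMTP host
    smtp.send_message(msg)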

 


 

I understand that this workflow just barely scratches the surface of the analytics that can be run on fishing. With that said, this is not just a blog article – this is a call to action! Use this workflow as a starting point, and from there pull in the tides, moon phases, temperature, weather, etc. Blend it all together, run some time series forecasting and predictive analytics, and, most importantly, let me know!
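
To make that call to action a little more concrete, here’s a toy starting point for the forecasting piece, assuming you’ve built up a daily series of total counts. The dates and values below are invented purely for illustration:

# A toy sketch of where the forecasting could start, assuming a daily
# history of total counts per day (values here are made up).
import pandas as pd

daily = pd.Series(
    [120, 95, 140, 180, 160, 130, 170],
    index=pd.date_range("2019-08-21", periods=7, freq="D"),
    name="total_fish",
)

# Naive baseline: forecast tomorrow as the trailing 3-day average.
forecast = daily.rolling(window=3).mean().iloc[-1]
print(f"Forecast for next day: {forecast:.0f} fish")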

 

Who knows, maybe one day you might look like Alex!

 

Comments
marktdavis3
6 - Meteoroid

Very cool!

aiReggie
5 - Atom

Interesting web scraping demo and workflow … thanks for sharing!

Jazyk
6 - Meteoroid

Very cool!  Thanks for sharing

gautiergodard
13 - Pulsar

This is cool, love the web scrape - this was an interesting way to do the HTML parse. 

Looking forward to playing around and finding a way to leverage the Python tool to do something similar. 

MichaelSu
Alteryx Alumni (Retired)

@gautiergodard ,

 

Thanks for reading! Sounds great, please share if you do build something via the Python tool. There's also a cool little nugget in there that @bbak helped with called dynamic text to columns. It's an interesting way of pulling out records listed in a single cell.
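
If you do go the Python tool route, the rough pandas equivalent of that trick is a split-and-explode. The sample data below is hypothetical:

# A hedged Python analogue of "dynamic text to columns": splitting a
# delimited cell into one record per row. Sample data is made up.
import pandas as pd

df = pd.DataFrame({
    "landing": ["H&M Landing"],
    "species": ["Yellowtail, Bluefin Tuna, Dorado"],
})

# Split the single cell on commas, then explode one species per row.
tidy = df.assign(species=df["species"].str.split(", ")).explode("species")
print(tidy)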

 

Thanks,

Mike

Traveltop
5 - Atom

Thanks for sharing