Under the hood of Alteryx: tips, tricks and how-tos.
In this article, we’re going to explore how we can leverage Alteryx to run analytics on fishing reports in Southern California. You’ll learn how to use web scraping to pull in data from 976-Tuna, macros to automate the cleansing process, spatial analytics to better understand where the fish are biting, reporting tools to visualize our analysis, and the Alteryx Server to automate all of this on a weekly basis.
(Insert Bodhi & Jonny Utah because it’s the Alteryx Community & THAT’S WHAT WE DO!)
If you’re an avid fisherman in the SoCal area, it’s likely you’ve been on 976-Tuna checking out the fish counts, reports, landings, and all the other necessary information before sinking some money into your next big offshore trip. Or, if you enjoy being subjected to witty banter and inevitable dad jokes for not catching anything, that’s fine too…
‘Well that’s why they call it fishing and not catching.’
For those of us who don’t want to get skunked, here’s how the process works. 976-Tuna updates its fishing reports daily and keeps all historic data and fish counts. With that in mind, the first step is to set a start date, which determines how many days of data to read in. To handle multiple days’ worth of data, the macro is built as a batch macro by default. In the example attached, it’s configured to dynamically read in yesterday’s counts by updating the day / month / year in the URL below (note that if you read in multiple days of data, you could also run forecasting and predictive analysis). Feeding the date into the macro adjusts the URL of the webpage being scraped and pushes it into the Download tool inside the macro.
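Outside of Alteryx, the date-driven URL step can be sketched in a few lines of Python. The URL pattern and query parameters below are assumptions for illustration; the real 976-Tuna page structure may differ:

```python
from datetime import date, timedelta

def report_url(day: date) -> str:
    # Hypothetical URL pattern -- stands in for the day / month / year
    # substitution the batch macro performs before the Download tool runs.
    return (
        "https://www.976-tuna.com/counts"
        f"?month={day.month}&day={day.day}&year={day.year}"
    )

# Dynamically target yesterday's counts, as the attached example does.
yesterday = date.today() - timedelta(days=1)
url = report_url(yesterday)
```

Batching over a range of dates is then just a loop over `report_url` for each day between the start date and yesterday.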
Once we’ve extracted the data, it’s time to clean it up and make sense of the HTML. Thankfully, our friends on the Community have already created an HTML parse workflow that you can simply paste into your workflow to get the data into a nice, clean format.
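The parsing step boils down to pattern-matching the raw HTML into rows. Here’s a minimal sketch of the idea using a regular expression; the table markup and field layout below are invented for illustration and will not match the real 976-Tuna page:

```python
import re

# Hypothetical snippet of a scraped counts page (made-up boats and numbers).
html = """
<tr><td>Pacific Queen</td><td>32 anglers</td><td>45 Yellowtail, 12 Bluefin Tuna</td></tr>
<tr><td>Liberty</td><td>25 anglers</td><td>60 Yellowfin Tuna</td></tr>
"""

# Pull boat, angler count, and catch summary out of each table row.
rows = re.findall(
    r"<tr><td>(.*?)</td><td>(\d+) anglers</td><td>(.*?)</td></tr>", html
)
counts = [
    {"boat": boat, "anglers": int(anglers), "catch": catch}
    for boat, anglers, catch in rows
]
```

In the workflow itself, the Community’s HTML parse macro plays this role, so you never have to hand-roll the expressions.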
From there, it’s a matter of visualizing our results. Thanks to the Alteryx Street Geocoder, we’re able to take ~20 different landings and geocode them so we can visualize them on a map. But wait, there’s more: we use thematic charting to see which areas are bringing in a lot of fish and which are not (hot / cold).
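The hot / cold shading is just a threshold over per-landing totals. A toy version of that classification, with made-up landings and counts standing in for the geocoded results:

```python
# Illustrative per-landing fish totals (not real 976-Tuna numbers).
totals = {"H&M Landing": 310, "Davey's Locker": 95, "Seaforth": 180}

# Landings at or above the average total get shaded "hot", the rest "cold" --
# a simple stand-in for the thematic map's color scale.
mean = sum(totals.values()) / len(totals)
theme = {name: ("hot" if n >= mean else "cold") for name, n in totals.items()}
```

In Alteryx this thresholding happens inside the thematic layer of the Report Map, but the underlying idea is the same.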
Do I head down to H&M Landing to fish Mexico Waters or hang tight at Davey’s Locker? Hmmm...
Finally, the report is automated so that every Wednesday we get an email telling us where they’re catching, how many, and whether they’re bringing in the good stuff - yellowtail, bluefin, yellowfin, dorado, etc. This, in turn, allows us to plan accordingly, prep for the sushi party, and decide whether or not we’re calling in sick to work on Friday! Just kidding, I would never do that. EVER.
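The "good stuff" filter behind that email is a simple membership check against a target-species list. A hedged sketch (the species set and the catch strings are illustrative, and in the workflow this logic lives in a Filter tool rather than Python):

```python
# Target species worth a "sick day" -- from the list above.
TARGETS = {"yellowtail", "bluefin", "yellowfin", "dorado"}

def worth_a_sick_day(catch: str) -> bool:
    # Tokenize a catch summary like "45 Yellowtail, 12 Bluefin Tuna"
    # and flag it if any target species appears.
    words = catch.lower().replace(",", " ").split()
    return any(w in TARGETS for w in words)
```

Run weekly on the Alteryx Server, reports passing this check are what end up in Wednesday’s email.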
I understand that this workflow just barely scratches the surface of the analytics that can be run on fishing. With that said, this is not just a blog article – this is a call to action! Leverage this workflow as a starting place and from there pull in the tides, moon phases, temperature, weather, etc. Blend it all together, run some time series forecasting and predictive analytics, and more importantly, let me know!
Who knows, maybe one day you might look like Alex!