Last weekend was the climax of nearly two months' work as part of my Alteryx for Good project. This work has encompassed huge amounts of data preparation (particularly using open data recently released by the UK's Department for Environment, Food and Rural Affairs – DEFRA), spatial analytics and advanced analytics. All this hard work culminated in a 12-hour focussed data dive to show the charities involved the 'art of the possible' in terms of combining such amazing data sources with analytics.
Developing resource-intensive workflows can be a challenge when testing changes involves running the workflow through iterations that may take several minutes or more before being able to see results. This post walks through using the Cache Dataset macro to develop workflows in a smarter way, avoid repeated long run times, and speed up the process of blending and analyzing larger sets of data.
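The Cache Dataset macro is an Alteryx tool, so there is no code to show directly, but the underlying idea translates to any language: run the expensive upstream steps once, persist the result to disk, and on subsequent development iterations load the saved copy instead of recomputing. As a rough analogy only, here is a minimal Python sketch of that pattern; the function names and the cache file path are illustrative assumptions, not part of Alteryx:

```python
import os
import pickle
import tempfile

# Hypothetical cache location for the expensive step's output
CACHE_PATH = os.path.join(tempfile.gettempdir(), "expensive_step_cache.pkl")

def expensive_step():
    """Stand-in for a long-running data preparation step
    (e.g. blending and spatial processing of a large dataset)."""
    return [i * i for i in range(10)]

def cached_expensive_step(cache_path=CACHE_PATH):
    """Return the step's output, reusing a cached copy when one exists."""
    if os.path.exists(cache_path):
        # Cache hit: skip the long run time and load the saved result
        with open(cache_path, "rb") as f:
            return pickle.load(f)
    # Cache miss: do the expensive work once, then persist it
    result = expensive_step()
    with open(cache_path, "wb") as f:
        pickle.dump(result, f)
    return result
```

The first call pays the full cost and writes the cache; every later call during development reads the cached file, which is exactly the time saving the macro provides while you iterate on the downstream parts of a workflow.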