Engine Works

Under the hood of Alteryx: tips, tricks and how-tos.
13 - Pulsar

If you didn’t catch Part 1 or Part 2 be sure to check them out.


Now that we know about the basics of analytic apps, the tools we will use in Alteryx, and some of the thought process that goes into app building, the next step is to look at a use case. To start we need to know:


   1) The data

   2) Some common interaction points in the data

   3) The business problem we are solving


Let’s start with the data:



(Retrieved from: https://ruthdobsontorres.files.wordpress.com/2014/03/showmethedatameme.jpg?w=584)


The dataset we will be using for the remainder of the series comes from Kaggle. If you aren’t familiar with Kaggle, it is a machine learning and data science community that is full of great datasets for use in mock situations. Some companies have used Kaggle to hold competitions in the space and have given money to the winners:







The first thing I did was take the three CSV files at this link and use Alteryx to combine them into one dataset we can leverage in the app. Once we have a single dataset (you will find it attached, so you can explore the data alongside this blog post), we will want to learn about it. Thankfully, Alteryx has a few ways to do so:
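If you are curious what that combine step looks like outside of Alteryx, here is a rough pandas sketch of the same joins. The frames below are tiny stand-ins, and the column names (Weekly_Sales, Fuel_Price, Size) are assumptions about the Kaggle files, not a guarantee of their actual layout:

```python
import pandas as pd

# Tiny stand-in frames; in practice you would pd.read_csv() the three
# downloaded files. Column names here are assumptions for illustration.
sales = pd.DataFrame({
    "Store": [1, 2],
    "Date": ["2012-02-03", "2012-02-03"],
    "Weekly_Sales": [24000.0, 18000.0],
})
features = pd.DataFrame({
    "Store": [1, 2],
    "Date": ["2012-02-03", "2012-02-03"],
    "Fuel_Price": [3.29, 3.31],
})
stores = pd.DataFrame({"Store": [1, 2], "Size": [151315, 202307]})

# Join sales to features on Store + Date, then attach store metadata,
# mirroring the Join tools you would chain together in a workflow.
combined = (
    sales.merge(features, on=["Store", "Date"], how="left")
         .merge(stores, on="Store", how="left")
)
```

The left joins keep every sales row even if a feature or store record is missing, which matches how you would typically preserve the transaction stream in Alteryx.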


1) Browse tool - from the Browse tool we can start to see a breakdown of the values in different fields once a workflow is run with data.




You can also click on a field in the Configuration window to see a profile for that column, which breaks down the values it contains and highlights any issues in the data.
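The kind of per-column profile the Browse tool shows can be approximated in a few lines of pandas. This is just a sketch of the idea with made-up values, not the Browse tool's actual calculations:

```python
import pandas as pd

# A toy stand-in for one column of the combined set (values are made up).
col = pd.Series(["A", "A", "B", None, "A"], name="Type")

# A minimal column profile: record counts, nulls, cardinality, top value.
profile = {
    "non_null": int(col.notna().sum()),
    "nulls": int(col.isna().sum()),
    "unique": int(col.nunique()),
    "top_value": col.mode().iloc[0],
}
```

The null count in particular is the kind of "issue in the data" the profile surfaces at a glance.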




This approach is fairly manual and would require you to take screenshots, jot down notes, or just have an amazing memory (I have to write everything down just to remember to do it... which reminds me, I need to mark down finishing the rest of this blog series 😉).


2) Data Investigation tools - the Data Investigation tools let you choose exactly what you want to look at: not only what exists in your data from a value standpoint, but also correlations between fields, summary tables, and some cool plots of your data.




The first tool I want to look at is the Basic Data Profile tool, whose Results output mirrors the Browse tool's features pretty well. I also want to look at the Field Summary tool, as it gives another nice breakdown of the details of this dataset.


The Basic Data Profile gives a brief intro to each column, with 21 specific stats per column in your set. Below is an example for the first column ([Store]) in the Combined Set yxdb file:




It breaks down both numeric and string values with their corresponding stats, and reports some metadata associated with each column.
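The numeric-versus-string split works much like pandas' `describe`, which also switches its stat set based on column type. A small sketch (toy data, and of course far fewer than the profile's 21 stats):

```python
import pandas as pd

# Toy frame with one numeric and one string column.
df = pd.DataFrame({
    "Store": [1, 1, 2, 2],            # numeric: gets min/max/mean-style stats
    "Type": ["A", "A", "A", "B"],     # string: gets count/unique/top-style stats
})

numeric_stats = df["Store"].describe()   # count, mean, std, min, quartiles, max
string_stats = df["Type"].describe()     # count, unique, top, freq
```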


For the Field Summary tool, I am going to focus on the R (Report) output from this tool. Feel free to investigate the other two outputs (O is the data stream with descriptive statistics for selected columns, and I is the interactive dashboard), but the R output produces a static report analyzing string fields and gives some good insight into each value:




I have seen results where a column contained only one value across every record, and the report suggested removing that column from the dataset. I have also seen Remarks that suggest good ways to clean up, modify, or adjust a column to help with analysis.
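That single-value check is easy to mimic yourself. Here is a sketch of the idea (my own few lines, not the Field Summary tool's actual logic):

```python
import pandas as pd

df = pd.DataFrame({
    "Store": [1, 2, 3],
    "Country": ["US", "US", "US"],  # constant column: a candidate to drop
})

# Flag columns where every record holds the same value (NaN counted too).
constant_cols = [c for c in df.columns if df[c].nunique(dropna=False) <= 1]
```

A constant column carries no information for analysis, which is why the report recommends dropping it.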


3) Field Info tool for metadata - the last tool to keep in your back pocket when looking through your data is the Field Info tool. It focuses more on metadata, but when dealing with apps the metadata is just as important, especially once you get into more complex actions for your apps.




Knowing your data types will also be helpful when designing your UI and deciding how you want to interact with your dataset. In some cases you may want to clean up the data before your app's actions take place (in this case, maybe an Auto Field tool to correct our data types and sizes).
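To see why the type fix matters, here is a small pandas sketch of the same situation: a numeric field that arrives as text and gets converted before any analysis runs. The column name is an assumption about the combined set:

```python
import pandas as pd

# Numbers read in as text -- a common reason to fix types before the
# app's actions run (the Auto Field tool plays a similar role in Alteryx).
df = pd.DataFrame({"Unemployment": ["7.8", "8.1", "6.9"]})

before = df["Unemployment"].dtype            # object: stored as strings
df["Unemployment"] = pd.to_numeric(df["Unemployment"])
after = df["Unemployment"].dtype             # numeric after conversion
```

Until the conversion, comparisons and math on the column would behave as string operations, which is exactly the kind of surprise you want to catch before wiring up app actions.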


Now that we know more about our dataset, the fun can begin!



(Retrieved from: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRnSvRcaCBcDSVHsKRfwgEG82wRU68OJv-mVg&usqp=CAU)


We can start to break down different use cases and see how potential apps could solve them. Looking at this data, there are several use cases to consider:


  1. Allow a user to define a range of gas prices and compare sales for that group against the rest of the stores
  2. Create an app that lets a user input a set day count before a holiday to analyze sales trends
  3. Set a cap on unemployment rates and compare how sales differ between the two resulting groups
  4. Create an app that looks at sales for specific days of the week and lets a user include or exclude weekends and holidays
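As a preview of the first use case, the core comparison is just a filter plus two aggregations. A minimal pandas sketch, where the column names Fuel_Price and Weekly_Sales are assumptions about the combined set and the numbers are made up:

```python
import pandas as pd

# Toy stand-in for the combined set.
df = pd.DataFrame({
    "Fuel_Price": [2.90, 3.10, 3.45, 3.60],
    "Weekly_Sales": [21000.0, 20000.0, 17000.0, 14000.0],
})

# The user-supplied range, as it might arrive from the app's interface tools.
low, high = 3.00, 3.50

in_range = df["Fuel_Price"].between(low, high)   # inclusive on both ends
mean_in = df.loc[in_range, "Weekly_Sales"].mean()
mean_out = df.loc[~in_range, "Weekly_Sales"].mean()
```

In the app version, `low` and `high` would come from interface tools updating a Filter tool rather than being hard-coded.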


Make sure to come back for the hands-on app building! 



Banner image by lucatelles.