
Past Analytics Excellence Awards


Author: Cesar Robles, Sales Intelligence Manager 

Company: Bavaria S.A.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve

On September 30th, 2015, a rumor that spread through WhatsApp reduced our Pony Malta sales to 40% of normal levels. A social media rumor that hits a brand destroys brand equity and creates distrust among customers. Our company's first-stage response plan actually helped spread the rumor to more customers during the first weeks, deepening the crisis. No brand in Colombia had suffered an attack like this before.

 

Describe the working solution

The Alteryx solution was developed to build a decision tree that classifies customers by their impact on sales volume into 5 groups, which allowed us to define differentiated protocols to recover our sales in a healthy way. These 5 groups were:

 

Citizens: Current customers with no impact from the social network crisis.
Refugees: Customers whose sales rate dropped significantly (below 50%) because of the social network crisis.
Deportees: Customers who stopped buying our brand because of the social network crisis.
Pilgrims: Customers with doubts about our products because of the social network crisis.
Aliens: New customers with no impact from the social network crisis.
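As a rough illustration, the split logic of such a decision tree can be sketched in Python with pandas. The column names (`pre_sales`, `post_sales`, `has_doubts`, `is_new`) and the exact ordering of the splits are hypothetical stand-ins for the real Alteryx decision tree:

```python
import pandas as pd

def classify_customer(row):
    """Assign a customer to one of the five recovery groups.
    The thresholds and columns here are illustrative assumptions."""
    if row["is_new"]:
        return "Alien"       # new customer, no crisis impact
    if row["post_sales"] == 0:
        return "Deportee"    # stopped buying entirely
    ratio = row["post_sales"] / row["pre_sales"]
    if ratio < 0.5:
        return "Refugee"     # sales dropped below 50% of normal
    if row["has_doubts"]:
        return "Pilgrim"     # still buying but expressing doubts
    return "Citizen"         # no measurable impact

customers = pd.DataFrame({
    "pre_sales":  [100, 100, 80, 90, 0],
    "post_sales": [95, 40, 0, 85, 20],
    "has_doubts": [False, False, False, True, False],
    "is_new":     [False, False, False, False, True],
})
customers["group"] = customers.apply(classify_customer, axis=1)
print(customers["group"].tolist())
# → ['Citizen', 'Refugee', 'Deportee', 'Pilgrim', 'Alien']
```

Each group can then be routed to its own recovery protocol, which is the point of the segmentation.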

 

Our gap during the crisis was 180k PoS (Points of Sale), impacting 92 KHl (kilo-hectoliters).

 

This workflow runs monthly and uses multiple sources of information in SQL Server related to customer properties and historic sales levels. We report results through Excel and Java applications to track the performance of our recovery actions. We are currently migrating to in-database processing to optimize the algorithm's performance, and to Tableau to manage our visualization process.

 

11.png

Figure 1. Decision Tree description

 

12.png

Figure 2. 1st Quarter Deportees results

 

13.png

Figure 3. 1st Quarter Refugees results

 

14.png

Figure 4. 1st Quarter Citizens results

 

15.png

Figure 5. Numerical Distribution Initial and End State

 

16.png

Figure 6. Blending Workflow

 

17.png

Figure 7. Decision Tree workflow

 

18.png

Figure 8. Hierarchy and Priority workflow

 

Describe the benefits you have achieved

The project defined a new approach to customer segmentation in our company. We now use the same algorithm not only for crisis contingency but also for brand expansion and price-control processes, incorporating geographical variables and external information from our providers (Nielsen, YanHass, Millward Brown).

 

The solution had not been implemented before Alteryx. We estimate that the original approach would have needed 2 to 3 weeks of development, compared with the 4 to 5 days we spent in Alteryx (and we had been using it for only a month when we built that first solution). Today our response time for similar solutions is under 2 days.

 

In business terms, we recovered 100k PoS (approximately 25% of the entire Colombian market) and brought sales back to 75% of normal levels within the first 3 months. In August 2016, we recovered our normal sales levels through the trade marketing actions supported by the Alteryx workflow.

Author: Jennifer Jensen, Sr. Analyst, and team members Inna Meerovich, RJ Summers

Company: mcgarrybowen 

 

mcgarrybowen is a creative advertising agency that is in the transformation business. From the beginning, mcgarrybowen was built differently, on the simple premise that clients deserve better. So we built a company committed to delivering just that. A company that believes, with every fiber of its being, that it exists to serve clients, build brands, and grow businesses.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve 

mcgarrybowen creates hundreds of pieces of social creative per year for Fortune 500 CPG and Healthcare brands, on platforms including Facebook and Twitter. The social media landscape is constantly evolving, especially with the introduction of video, a governing mobile-first mindset, and interactive ad units like carousels, yet the platforms' capabilities for measuring performance have not followed as closely.

 

Our clients constantly want to know: Which creative is the most effective, drives the highest engagement rates, and delivers most efficiently? What time of day and day of week are best for posting content? What copy and creative work best? What learnings have you had on the other brands you manage?

 

But therein lies the challenge. Answers to these questions aren't readily available in the platforms, which export post-level data in raw spreadsheets with many tabs of information. Both Facebook and Twitter can export only 90 days of data at a time. So, to look at client performance over longer periods of time, compare clients to their respective categories, and derive performance insights that drive cyclical improvements in creative, we turned to Alteryx.

 

Describe the working solution

Our Marketing Science team first created Alteryx workflows that blended multiple quarters and spreadsheet tabs of social data for each individual client. The goal was to take many files over several years that each contained many tabs of information, and organize it onto one single spreadsheet so that it was easily visualized and manipulated within Excel and Tableau for client-level understanding. In Alteryx, it is easy to filter out all of the unnecessary data in order to focus on the KPIs that will help drive the success of the campaigns.  We used “Post ID,” or each post’s unique identifying number, as a unifier for all of the data coming in from all tabs, so all data associated with a single Facebook post was organized onto a single row.  After all of the inputs, the data was then able to be exported onto a single tab within Excel.
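The Post ID unification step can be sketched with pandas. The tab names and metric columns below are hypothetical, but the pattern of left-joining every tab on the post's unique ID is the one described:

```python
import pandas as pd

# Hypothetical miniature of a post-level export: each tab carries
# different metrics, all keyed by the post's unique Post ID
lifetime_reach = pd.DataFrame({"Post ID": [1, 2], "Reach": [5000, 7200]})
engagement = pd.DataFrame({"Post ID": [1, 2], "Reactions": [310, 450]})
video_views = pd.DataFrame({"Post ID": [2], "Views": [1200]})

merged = lifetime_reach
for tab in (engagement, video_views):
    # left join keeps every post; tabs missing a metric yield NaN
    merged = merged.merge(tab, on="Post ID", how="left")
print(merged)  # one row per post, all metrics side by side
```

The result is the single-row-per-post layout that makes the data easy to visualize in Excel or Tableau.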

 

After each client’s data was cleansed and placed into a single Excel file, another workflow was made that combined every client’s individual data export into a master file that contained all data for all brands.  From this, we can easily track performance over time, create client and vertical-specific benchmarks, and report on data efficiently and effectively.

 

Single Client Workflow

mcgarrybowen1.png

 

Multi-Client Workflow

mcgarrybowen2.png

 

Describe the benefits you have achieved

Without Alteryx, it would take countless hours to manually work with the social data in 90-day increments and manipulate it within Excel to mimic what the Alteryx workflow export does in seconds. With all of the saved time, we are able to spend more time on the analysis of these social campaigns. Since we can put more time into thoughtful analysis, client satisfaction with deeper learnings has grown substantially. Not only do we report on past performance, but we can also look toward the future and use more real-time information to better analyze and optimize.

Author: Brett Herman ( @brett_hermann ), Project Manager, Data Visualization

Company: Cineplex

 

Cineplex Inc. (“Cineplex”) is one of Canada’s leading entertainment companies and operates one of the most modern and fully digitized motion picture theatre circuits in the world. A top-tier Canadian brand, Cineplex operates numerous businesses including theatrical exhibition, food service, amusement gaming, alternative programming (Cineplex Events), Cineplex Media, Cineplex Digital Media, and the online sale of home entertainment content through CineplexStore.com and on apps embedded in various electronic devices. Cineplex is also a joint venture partner in SCENE – Canada’s largest entertainment loyalty program. 

 

Awards Category: Most Time Saved

 

Describe the problem you needed to solve 

Incremental/uplift modelling is a popular method of evaluating the success of business initiatives at Cineplex. Its effectiveness at measuring the change in consumer behavior over time creates high demand for this kind of analysis from various departments in the organization. Due to the large number of requests we receive, the 'Incremental Lift Model' was developed to take in user-defined inputs and output the results within a short period of time.

 

Describe the working solution

Our solution works through a four step process. The first step is for the client to populate the ‘study input form’ in order to define their study parameters and the type of study they want to run.

 

Visual 1: Study Input Form

Alteryx Analytics Excellence Awards 2016 2H - bhermann - Visual 1.jpg

 

The second step is to update/materialize the loyalty data that feeds the model (yxdb format). We do this so that the model doesn't put stress on our SQL Server databases, and to increase the model's performance.

 

Visual 2: Update/Materialize Alteryx Input Data

Alteryx Analytics Excellence Awards 2016 2H - bhermann - Visual 2.jpg

 

The third step is the core of the incremental lift modelling. A macro processes one study at a time by pointing to the user defined inputs made in the first step.

 

Visual 3: Study numbers are selected and passed through the incremental lift macro, and the output is saved to SQL.

Alteryx Analytics Excellence Awards 2016 2H - bhermann - Visual 3.jpg

 

The data will then be passed through one of several macros depending on the study type, and filtered down based on the inputs defined by the user in the study input form. All data sources are joined together and lift calculations are made, which are then outputted into a master SQL Table ready to be visualized.
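As a simplified illustration of a lift calculation of this kind (not the actual macro logic), a pre/post comparison of a study group against a control group might look like the following; the group labels and spend figures are invented:

```python
import pandas as pd

def incremental_lift(df):
    """Classic pre/post uplift vs. a control group: the study group's
    growth is adjusted by the control group's growth over the same
    window, isolating the effect of the initiative."""
    g = df.groupby("group")[["pre_spend", "post_spend"]].sum()
    growth = g["post_spend"] / g["pre_spend"]  # raw growth rates
    # lift = study growth relative to control growth, minus 1
    return growth["study"] / growth["control"] - 1

members = pd.DataFrame({
    "group":      ["study", "study", "control", "control"],
    "pre_spend":  [100.0, 100.0, 100.0, 100.0],
    "post_spend": [132.0, 108.0, 105.0, 95.0],
})
print(f"{incremental_lift(members):.0%}")  # prints 20%
```

The real macros additionally branch by study type and filter on the study-input-form parameters before computing the lift.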

 

Visual 4: Incremental Lift Modelling based on study type selected.

Alteryx Analytics Excellence Awards 2016 2H - bhermann - Visual 4.jpg

 

The results are visualized using a Tableau Dashboard in order to share and communicate the results of the study back to the business clients.

 

Visual 5: Tableau Dashboard to explain to the business how the incremental lift model makes its calculations.

Alteryx Analytics Excellence Awards 2016 2H - bhermann - Visual 5.jpg

 

Alteryx Analytics Excellence Awards 2016 2H -bhermann - Visual 6.jpg

 

 

Describe the benefits you have achieved

The overarching goal of this project was twofold: to minimize the amount of work required to process business requests while maximizing the output generated, and to develop a means of delivering the results in a consistent manner. Both of these goals contribute greatly to our ROI by virtually eliminating time spent executing incoming requests, and by minimizing time spent meeting with business users to explain how the incremental lift model works and how to interpret the results.

 

Author: Mandy Luo, Chief Actuary and Head of Data Analytics

Company: ReMark International

 

Awards Category: Best Use of Predictive

As a trained statistician, I understand why "70% data, 30% model" is not an exaggeration. Therefore, before applying any regression models, I always make sure that the input data are fully reviewed and understood. I use various data preparation tools to explore, filter, select, sample, or join data sources. I also use the data investigation tools to conduct or validate statistical evaluations. Next, I usually choose 3-5 predictive modeling candidates depending on the modeling objective and data size, and I often include one machine learning method in order to at least benchmark the other models. After the candidate models finish running, I select the best model based on both art (whether the coefficients look reasonable given my understanding of the data and the business) and science (statistical criteria like goodness of fit, P-values, and cumulative lift). I also often use the render function for model presentation and the scoring/sorting functions for model validation and application.

 

Describe the problem you needed to solve 

ReMark is not only an early adopter of predictive modeling for life insurance, but also a true action taker on customer centricity, focusing on customer lifetime analytics instead of on 'buying' alone. In this context, we need to join up our predictive models on customer response, conversion, and lapse in order to understand the most powerful predictors that drive customer activity across the pre- and post-sales cycle. We believe the industry understands that it is insufficient to focus on any single customer activity, but it is still exploring how this can be improved through modeling and analytics, which is where we can add value.

 

Describe the working solution

Our working solution follows these steps:

  1. Match over one year of post-sales tracking data back to sales payment data and marketing data (all de-personalized)
  2. Build 3 predictive models: sale (whether the purchase is agreed or not), conversion (whether the first premium bill is paid or not), and 1-year persistency (whether lapse happened at month 13 or not)
  3. Compare model results by key customer segments and profiles
  4. Export to a visualization tool (e.g. Tableau) to present results
  5. Model use test: scoring overlay and optimization strategy
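Step 2 can be illustrated with a toy sketch: one binary model per customer activity (sale, conversion, persistency) fit on the same de-personalized features. The minimal gradient-descent logistic regression and the synthetic data below are only stand-ins for the predictive tools actually used:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (illustrative only)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        z = np.clip(X @ w, -30, 30)          # avoid overflow in exp
        p = 1 / (1 + np.exp(-z))
        w -= lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(0)
n = 400
X = np.c_[np.ones(n), rng.normal(size=(n, 2))]   # intercept + 2 features

# One model per customer activity, as in the three-model setup
targets = {
    "sale":        (X[:, 1] > 0).astype(float),
    "conversion":  (X[:, 2] > 0).astype(float),
    "persistency": (X[:, 1] + X[:, 2] > 0).astype(float),
}
weights = {name: fit_logistic(X, y) for name, y in targets.items()}
for name, w in weights.items():
    print(name, w.round(1))   # which predictor drives each activity
```

Comparing coefficients across the three models is what reveals which predictors matter across the whole pre- and post-sales cycle.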

 

Describe the benefits you have achieved

  • We could create customer segments based not just on tendency to 'buy', but also tendency to 'pay' and 'stay'.
  • We could further demonstrate ReMark's analytics and modeling capabilities across the whole customer lifetime value chain.

Author: Sintyadi Thong ( @MizunashiSinayu ), Pre-Sales Consultant, Karen Kamira & Harry Yusuf

Company: Lightstream Analytics PTE Ltd.

 

LightStream Analytics focuses on performance management, big data, advanced analytics, and innovative visualization through cloud SaaS applications, mobile, and traditional on-site systems. LightStream Analytics is well-positioned to deliver the most advanced products and services by capitalizing on its significant regional presence in Singapore and Indonesia. The combined offices have over 60 employees with deep technical and senior business experience. The company leverages our existing technical support and R&D centers in Indonesia and China to develop solutions which disrupt customary methods of data analysis and give clients access to revolutionary tools for understanding their data.  LightStream Analytics has partnered with more than 100 multinational and local clients to integrate, structure, analyze, and visualize information to measure their business performance and drive enterprise value growth.

 

Awards Category: Most Time Saved

  

Describe the problem you needed to solve 

 

One of our clients tried to implement one of the leading Business Intelligence solutions to help them grow their business, through another company (we can pretty much say our competitor). However, one thing hindered the development of the BI. While most companies want to see the dates of sales (on which dates their agents made sales), this company wanted to see it the other way around: on which dates their agents performed no sales activity. For them this is very important. The BI developers hit a dead end on this, and that is where I came in with Alteryx.

 

Describe the working solution

The BI I previously mentioned is QlikView. Qlik can do it, I can guarantee that, but it involves heavy scripting and logic, which in turn demands heavy resources at run time (visible when running with low RAM). Alteryx, on the other hand, can do this easily with a drag-and-drop, repeatable workflow: I feed Alteryx the actual sales data and perform several left joins, a filter, and a unique. Alteryx requires no scripting; to be honest, I am not even an IT guy and know nothing about SQL or programming, yet I could create this workflow easily. So we proposed having Alteryx prepare and blend the data before feeding it to QlikView, which makes the data visible beforehand and lessens the burden on QlikView. While the client has not yet confirmed whether they will get Alteryx, it is really satisfying and rewarding to solve easily a problem that others struggled with.
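The join/filter idea, finding agent-date pairs with no sales activity, amounts to a left anti-join. A sketch in pandas, with invented agents and dates:

```python
import pandas as pd

# Hypothetical sales log: one row per sale, per agent, per date
sales = pd.DataFrame({
    "agent": ["ana", "ana", "budi"],
    "date":  pd.to_datetime(["2016-07-01", "2016-07-03", "2016-07-01"]),
})

# Build every agent x date combination, then keep only the pairs
# with no matching sale -- a left anti-join, the same idea as the
# Alteryx join/filter/unique chain described above
all_dates = pd.DataFrame({"date": pd.date_range("2016-07-01", "2016-07-03")})
agents = sales[["agent"]].drop_duplicates()
grid = agents.merge(all_dates, how="cross")

merged = grid.merge(sales, on=["agent", "date"], how="left", indicator=True)
idle = merged[merged["_merge"] == "left_only"].drop(columns="_merge")
print(idle)  # the dates on which each agent made no sales
```

The `indicator=True` flag is what exposes which rows of the full grid found no match in the sales data.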

 

Describe the benefits you have achieved

I created the workflow in only an hour versus their two weeks of development for this one case (which failed after those two weeks). This shows how much time the client would save by developing QlikView alongside Alteryx. Alteryx helps customers get results faster and perform advanced ETL that might be hard to do in traditional SQL.

Author: Rana Dalbah, Director - Workforce Intelligence & Processes

Company: BAE Systems

 

Awards Category: Most Unexpected Insight - Best Use Case for Alteryx in Human Resources

 

People do not expect those of us working in Human Resources to be technology savvy, let alone to become technology leaders and host a "Technology Day" to show HR and other functions the type of technology that we are leveraging and how it has allowed us, as a team, to become more efficient and scalable.

 

Within the Workforce Intelligence team, a team responsible for HR metrics and analytics, we have been able to leverage Alteryx in a way that has allowed us to become more scalable and not "live in the data", spending the majority of our time formatting, cleansing, and re-indexing. For example, Alteryx replaced both Microsoft T-SQL and coding in R for our HR Dashboard, which allowed us to decrease the pre-processing time of our HR Dashboard from 8-16 hours per month to less than 10 minutes per month, which does not account for the elimination of human intervention and error.

 

The time savings from Alteryx have allowed us to create custom metrics in the dashboard at a faster rate to meet customer demands. They have also given us the opportunity to pursue other aspects of Alteryx: forecast modeling, statistical analysis, predictive analytics, etc. The fact that we can turn an HR Dashboard around in two days instead of one week has been a game changer.

 

The HR dashboard is considered to have relevant data that is constantly used in our Quarterly Business Reviews, and it has attracted the attention of our CEO and senior leadership. Another use that we have found for Alteryx is a workflow for our Affirmative Action data processing. Our Affirmative Action process had lacked consistency over the years and had changed hands countless times, with no one person owning it for more than a year. After seeing the capabilities demonstrated by our HR Dashboard, we decided to leverage Alteryx to create a workflow for our Affirmative Action processing that took 40 hours of work down to 7 minutes, plus an additional hour for source data recognition and correction. We not only cut a two- or three-month process down to a few minutes, but we also now have a documented workflow that lists all the rules and exceptions for our process and needs only slight tweaks as requirements change.

 

For our first foray into predictive analytics, we ran a flight risk model on a certain critical population. Before Alteryx, the team used SPSS and R for the statistical analysis and created a Microsoft Access database to combine and process at least 30 data files. The team was able to run the process, with predictive measures, in about 6 months. After the purchase of Alteryx, the workflow was recreated and refined in Alteryx, and we were able to run a smaller flight risk analysis on another subset of our population in about a month, with better visualizations than R had to offer. By reducing the data wrangling time, we can create models in a more timely fashion, while the results are still relevant.

 

The biggest benefit of these time-savings is that it has freed up our analytics personnel to focus less on “data chores” and more on developing deeper analytics and making analytics more relevant to our executive leadership and our organization as a whole.  We’ve already become more proactive and more strategic now that we aren’t focusing our time on the data prep.  The combination of Alteryx with Tableau is transformative for our HR, Compensation, EEO-1, and Affirmative Action analytics.  Now that we are no longer spending countless hours prepping data, we’re assisting other areas, including Benefits, Ethics, Safety and Health, Facilities, and even our Production Engineering teams with ad-hoc analytics processing.

 

Describe the problem you needed to solve 

A few years ago, HR metrics were a somewhat foreign concept for our senior leadership. We could barely get consensus on the definitions of headcount and attrition. But in order for HR to bring to the table what Finance and Business Development do (metrics, data, measurements, etc.), we needed a way to display relevant HR metrics that could steer senior leadership in the right direction when making decisions about the workforce. We launched an HR Dashboard in January 2014; it was simple and met only minimum requirements, but it was a start. We used Adobe, Apache code, and SharePoint, along with data in Excel files, to create simple metrics and visuals. In April 2015, we launched the HR Dashboard using Tableau with the help of a third party that used Microsoft SQL Server to read the data and visualize it based on our requirements. However, this was not the best solution for us because we could not make dynamic changes to the dashboard quickly. The dashboard was being released about two weeks after fiscal month end, which is an eternity in terms of relevance to our senior leadership.

 

Once we had the talent in-house, we were able to leverage our technological expertise in Tableau and then, with the introduction of Alteryx, create workflows that cut a 2-week process down to a few days, including data validation and dashboard distribution to the HR Business Partners and senior leadership. But why stop there? We viewed Alteryx as a way to refine existing manual processes (marrying multiple Excel files using VLOOKUPs, pivot tables, etc.) that were not necessarily documented by the users, and to cut back on processing time. If we can build it once and spend minimal time maintaining the workflow, why not build it? This way, all one has to do in the future is append or replace a file and hit the start button, and the output is created. Easy peasy! That is when we decided to leverage this tool for our compliance team and build out the Affirmative Action process described above, along with the EEO-1 and Vets processing.

 

What took months and multiple resources now takes minutes and only one resource.

 

Describe the working solution

The majority of the data we use comes from our HCM (Human Capital Management) database in Excel-based files. In addition to the HCM files, we also use files from our applicant tracking system (ATS), IMPACT Awards data, our benefits provider, 401K, Safety and Health data, and pension providers.

 

Anything that does not come out of our HCM system comes from a third-party vendor. These files are used specifically for our HR dashboard, Affirmative Action Plan workflow, Safety & Health dashboard, and our benefits dashboard.

 

In addition to dashboards, we have been able to leverage the mentioned files along with survey data and macro-economic factors for our flight risk model. We have also leveraged Google map data to calculate the commute time from an employee's home zip code to their work location zip code. This was a more accurate measurement of time spent on the road to and from work when compared to distance.

 

The ultimate outputs vary. An HR dashboard that tracks metrics such as demographics, headcount, attrition, employee churn/movement, rewards, and exit surveys is published as a Tableau workbook. The flight risk analysis allows us to determine which factors most contribute to certain populations leaving the company. A compensation dashboard that gives executives a quick way to do merit and incentive compensation planning (base pay, pay ratios, etc.) is also published as a Tableau workbook.

 

This workflow has as its input our employee roster file, which includes the employee’s work location and supervisor identifiers and work locations going up to their fourth level supervisor.  For the first step of processing, we used stacked-joins to establish employee’s supervisor hierarchies up to the 8th level supervisor.  We then needed to assign initial “starting location” for an employee based on the location type.  That meant “rolling up” the employee’s location until we hit an actual company, not client, site.  We did this because Affirmative Action reporting requires using actual company sites.  The roll-up was accomplished using nested filters, which is easier to see, understand, modify, and track than a large ELSEIF function (important for team sharing). 

 

Once the initial location rollup was completed, we then needed to rollup employees until every employee was at a site with at least 40 employees.  While simply rolling all employees up at once would be quick, it would also result in fewer locations and many employees being rolled up too far from their current site which would undermine the validity and effectiveness of our Affirmative Action plan.  Instead, we used a slow-rolling aggregate sort technique, where lone employees are rolled up into groups of two, then groups of two are rolled up into larger groups, and so on until sites are determined with a minimum of 40 employees (or whatever number is input).  The goal is to aggregate employees effectively, while minimizing the “distance” of the employee from their initial site.  This sorting was accomplished using custom-built macros with a group size control input that can be quickly changed by anyone using the workflow.
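A toy version of the slow-rolling roll-up, with hypothetical site names and a parent map standing in for the real location hierarchy: repeatedly merge the smallest under-threshold site into its parent, so each employee moves as little as possible from their initial site.

```python
def roll_up(counts, parent, minimum=40):
    """Merge the smallest site below `minimum` into its parent site,
    repeating until every remaining site meets the threshold.
    A sketch of the idea only; the real workflow uses Alteryx macros
    with a group-size control input."""
    counts = dict(counts)  # leave the caller's data untouched
    while True:
        # sites below the threshold that still have somewhere to roll up to
        small = [s for s, n in counts.items() if n < minimum and s in parent]
        if not small:
            return counts
        site = min(small, key=counts.get)        # smallest group first
        target = parent[site]
        counts[target] = counts.get(target, 0) + counts.pop(site)

sites = {"HQ": 120, "Plant": 55, "Office A": 12, "Office B": 25}
parents = {"Office A": "Plant", "Office B": "HQ"}
print(roll_up(sites, parents))
# → {'HQ': 145, 'Plant': 67}
```

Taking the smallest group each pass is what keeps roll-up "distances" short, rather than collapsing everything upward in one step.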

 

The end result was the roster of employees with the original data, with new fields identifying their roll-up location, and what level of roll-up from their initial location was needed.  A small offshoot of “error” population (usually due to missing or incorrect data) is put into a separate file for later iterative correction.

 

Previously, this process was done through trial and error via Access and Excel. That process was not only much slower and more painstaking, but it also tended to result in larger "distances" of employees from their initial sites than was necessary. As a result, our new process is quicker, less error-prone, and arguably more defensible than its predecessor.

 

image001.png

 

One of the Macros used in AAP:

 

image002.png

 

Describe the benefits you have achieved

Alteryx has enabled our relatively small analytics shop (3 people) to build a powerful, flexible, and scalable analytics infrastructure without going through our IT support. We are independent and thus can respond to users' custom requests in a timely fashion. We are seen as agile and responsive, creating forecasting workflows in a few days to preview for our CEO and CHRO instead of PowerPoint slides previewing a concept. This way, we can show them what we expect it to look like and how it will work, and as they give us feedback we can work in parallel to meet their requirements. The possibilities of Alteryx, in our eyes, are endless, and for a minimal investment we are constantly "wowing" our customers with the service and products we provide. In the end, we have shown that HR can leverage the latest technologies to become more responsive to business needs without the need for IT or developer involvement.

Author: Katie Snyder, Marketing Analyst

Company: SIGMA Marketing Insights

 

Awards Category: Most Time Saved

 

We've taken a wholly manual process that took 2 hours per campaign and required a database developer, to a process that takes five minutes per campaign, and can be done by an account coordinator. This frees our database developers to work on other projects, and drastically reduces time from data receipt to report generation.

 

Describe the problem you needed to solve 

We process activity files for hundreds of email campaigns for one client alone. The files come in from a number of different external vendors, are never in the same format with the same field names, and never include consistent activity types (bounces or opt-outs might be missing from one campaign, but present in another). We needed an easy, user-friendly way for these files to be loaded in a consistent manner. We also needed to add some campaign ID fields that the end user wouldn't necessarily know - they would only know the campaign name.

 

Describe the working solution

Using interface tools, we created an analytic app that allowed maximum flexibility in this file processing. Using a database query and interface tools, Alteryx displays a list of campaign names that the end user selects. The accompanying campaign ID fields are passed downstream. For each activity type (sent, delivered, bounce, etc), the end user selects a file, and then a drop down will display the names of all fields in the file, allowing the user to designate which field is email, which is ID, etc. Because we don't receive each type of activity every time, detours are placed to allow the analytic app user to check a box indicating a file is not present, and the workflow runs without requiring that data source.
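The field-designation step of the app, mapping whatever each vendor called its columns onto a standard schema, can be sketched as follows; the vendor layouts and target field names are hypothetical:

```python
import pandas as pd

def normalize_activity(df, mapping):
    """Rename a vendor file's columns to our standard schema.
    `mapping` plays the role of the app's drop-down selections,
    e.g. {"EmailAddr": "email", "MemberNum": "id"}."""
    out = df.rename(columns=mapping)
    out["activity_present"] = True
    return out[["email", "id", "activity_present"]]

# Two vendors, two layouts, one standard output
bounce_v1 = pd.DataFrame({"EmailAddr": ["a@x.com"], "MemberNum": [7]})
bounce_v2 = pd.DataFrame({"E-mail": ["b@y.com"], "CustID": [9]})

std = pd.concat([
    normalize_activity(bounce_v1, {"EmailAddr": "email", "MemberNum": "id"}),
    normalize_activity(bounce_v2, {"E-mail": "email", "CustID": "id"}),
], ignore_index=True)
print(std)
```

The detour-style handling of missing activity types would amount to simply skipping a vendor file that was not supplied.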

 

Picture1.png

 

Picture2.png

 

All in all, up to six separate Excel or CSV files are combined with information already existing in a database, and a production table is created to store the information. The app also generates a QC report, including counts, campaign information, and row samples, that is sent to the account manager. This increases accountability and oversight, and ensures all members of the team are kept informed of campaign processing.
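A minimal sketch of what such a QC summary could compute (the real report's contents and layout differ; the columns here are invented):

```python
import pandas as pd

def qc_report(df, campaign):
    """Hypothetical QC summary: row counts per activity type plus a
    small row sample, mirroring what is sent to the account manager."""
    return {
        "campaign": campaign,
        "rows": len(df),
        "counts_by_activity": df["activity"].value_counts().to_dict(),
        "sample": df.head(2).to_dict("records"),
    }

activity = pd.DataFrame({
    "email":    ["a@x.com", "b@y.com", "c@z.com"],
    "activity": ["sent", "sent", "bounce"],
})
report = qc_report(activity, "2016-08 Newsletter")
print(report["counts_by_activity"])  # → {'sent': 2, 'bounce': 1}
```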

 

Process Opt Out File - With Detour:

 

Picture3.png

 

Join All Files, Suppress Duplicates, Insert to Tables:

 

Picture4.png

 

Generate QC Report:

 

Picture5.png

 

Workflow Overview:

 

Picture6.png

 

QC Report Example:

 

Picture7.png

 

Describe the benefits you have achieved

In making this process quicker and easier to access, we save almost two hours of database developer time per campaign, which amounts to at least 100 hours over the course of a year. The app can be used by account support staff who don't have coding knowledge, or even by account staff on other accounts without any client-specific knowledge, saving further resources. Furthermore, the app can easily be adapted for other clients, increasing time savings across our organization. Our developers can spend their time on far more complex work rather than routine coding, and because the process is automated, we avoid the rework time that coding mistakes would otherwise cause. And the client is thrilled because it takes us less time to generate campaign reporting.

Author: Alberto Guisande (@Aguisande), Services Director

 

Awards Category: Most Unexpected Insight - Proving teachers wrong - Apples & Oranges can be compared! (thanks to Alteryx)

  

Describe the problem you needed to solve 

Our customer is a public transportation company in charge of the buses that circulate around Panama City. They transport more than 500K passengers a day (1/6 of the country's total population): almost 400 routes, with 1,400 buses running around the city every day, 24/7, reporting their position every few seconds. The company supports its operation with a variety of tools, but when it came time to put all the data together, they realized there was no "point of contact" in the data. They had to compare apples and oranges! Really? Why does the saying exist? Because you can't! So we set out to do the impossible.

 

BTW, the business questions are pretty simple (once you have the data!): Which route was each bus on when each transaction occurred? What is the demand for each route? And for each stop?

 

Describe the working solution

Working with Alteryx, we were able to analyze data coming from three different sources, where the only common information was a LATITUDE & LONGITUDE (taken with different equipment, so the accuracy was questionable, to say the least) at some random points in time. The data was received in several files:

 

  • Routes: contains the ID & name of every route
  • Stop Points: contains every bus stop, its LAT & LONG, and the stop name
  • Pattern Detail: contains every route, its stops, and the sequence of those stops within the route
  • Some remarks: a lot of stops are shared by different routes, and there are stops a bus passes through that are not part of the specific route it is on

 

So far, the easy part! We managed very easily to get all this info together. Now the tricky part: there are mainly two operational datasets. The first is AVL (every position of every bus, every n seconds, where n is an arbitrary number between 0 and whatever the piece of hardware wanted to use). BTW, a huge amount of data every day.

 

The second is Transactions: transactions registered in time, on a bus. As you may infer, there is no common data that allows us to match records besides an arbitrary range of latitude and longitude in some random time ranges. Because of how everything is reported, a bus may be passing in front of a stop that belongs to another route, or stopping far from the designated stop.
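The core of this kind of matching boils down to nearest-stop assignment within a distance tolerance. A minimal Python sketch (the coordinates, stop IDs, and 300 m tolerance are all invented for illustration; the real workflow uses Alteryx spatial tools):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical stops (id, lat, lon). Because GPS accuracy differs by
# device, we accept the nearest stop within a generous tolerance rather
# than requiring an exact coordinate match.
stops = [("S1", 8.9824, -79.5199), ("S2", 8.9936, -79.5197)]

def nearest_stop(lat, lon, max_dist_m=300):
    best = min(stops, key=lambda s: haversine_m(lat, lon, s[1], s[2]))
    d = haversine_m(lat, lon, best[1], best[2])
    return best[0] if d <= max_dist_m else None

# A transaction logged a short walk from stop S1 still resolves to S1.
print(nearest_stop(8.9830, -79.5195))
```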

 

Describe the benefits you have achieved

With this solution, the company can start analyzing activity per route, and demand per bus, route, stop, etc. Without Alteryx, this customer's information would still look like apples and oranges! We were able to make sense of it and let them use it to get insights.

 

Colorful note (and a little ego boost): 5 other vendors took the challenge. None of them came close to a solution (of course, "no Alteryx, no gain").

 

Processpng.png

Process2.png

Process3.png

Author: Scott Elliott (@scott_elliott) , Senior Consultant

Company: Webranz Ltd

 

Awards Category: Best Use of Alteryx Server

 

We are using the server to store Alteryx apps that get called by the "service bus", perform calculations, and write the results into a warehouse, where growers can log into a web portal and check the results of their sample.

 

Describe the problem you needed to solve 

Agfirst BOP is an agricultural testing laboratory business that performs scientific measurements on kiwifruit samples it receives from 2,500 growers around New Zealand. In peak season it tests up to 1,000 samples of 90 fruit per day. The sample test results trigger picking of the crop, cool storage, shipping, and sales to foreign markets. The grower receives notification from the laboratory when sample testing is complete, then logs into a portal to check the results. Agfirst BOP was looking for a new technology to move the results from the service bus up to the web portal, one that gave them agility around modifying or adding tests.

 

Describe the working solution

We take sample measurement results from capture devices, which get shipped to a landing warehouse. A trigger calls the Alteryx app residing on the Alteryx Server for each sample and test type. The app then performs a series of calculations and publishes the results into the results warehouse, and the grower can log into the web portal and check their sample. Each app contains multiple batch macros, which allow processing sample by sample. Some tests require advanced analytics; these call R as part of the app. The use of macros is great, as it provides amazing flexibility and agility to plug new tests or calculations in or out. Running on Alteryx Server makes the solution enterprise-class: it can be scaled while staying flexible, and it is fully supported by the infrastructure team because it is managed within the data centre rather than on a local desktop.
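Conceptually, the batch-macro pattern is a per-sample loop that dispatches each sample to the right calculation for its test type. A simplified Python sketch (the sample IDs, test name, and mean-based calculation are all illustrative; the real tests are scientific measurements, some delegated to R):

```python
import statistics

# Hypothetical sample measurements keyed by (sample_id, test_type), as
# they might arrive from capture devices via the landing warehouse.
samples = {
    ("KF-001", "dry_matter"): [16.2, 16.5, 16.1],
    ("KF-002", "dry_matter"): [15.8, 15.9],
}

# One calculation per test type; new tests plug in by adding an entry,
# which mirrors how macros make tests easy to add or remove.
calculations = {
    "dry_matter": lambda values: round(statistics.mean(values), 2),
}

def process_all(samples):
    """Batch-macro-style loop: apply the right calculation per sample."""
    results = {}
    for (sample_id, test_type), values in samples.items():
        results[(sample_id, test_type)] = calculations[test_type](values)
    return results

results = process_all(samples)
print(results[("KF-001", "dry_matter")])
```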

 

App:

 

Agfirst APP.jpg

 

Batch Macro:

 

Agfirst Batch Macro.jpg

 

Describe the benefits you have achieved

The benefits realised include greater agility around adding/removing sample tests via the use of macros. We are able to perform advanced analytics by calling R, and it future-proofs the business: Alteryx's ability to blend multiple sources means they can choose any number of vendors and not be limited by the technology. It gives them amazing flexibility around future technology choices, and it is all supported and backed up by the infrastructure team because it sits within the datacentre; they take great comfort in knowing it's not something sitting under someone's desk.

Author: Jack Morgan (@jack_morgan), Project Management & Business Intelligence

 

Awards Category: Most Time Saved

 

After adding up the time savings for our largest projects we came up with an annual savings of 7,736 hours - yea, per year! In that time, you could run 1,700 marathons, fill 309,000 gas tanks or watch 3,868 movies!! Whaaaaaaaaaaaaat! In said time savings, we have not done any of the previously listed events. Instead, we've leveraged this time to take advantage of our otherwise unrealized potential for more diverse projects and support of departments in need of more efficiency. Other users that were previously responsible for running these processes now work on optimizing other items that are long overdue and adding value in other places by acting as project managers for other requests.

 

Describe the problem you needed to solve 

The old saying goes, "Time is of the essence," and there are no exceptions here! More holistically, we brought Alteryx into our group to better navigate disparate data and build one-time workflows that become sustainable, accurate processes. In a constraint-driven environment, my team is continuously looking for ways to do things better; whether that means faster, more accurately, or with less oversight is up to us. The bottom line is that Alteryx provides speed, accuracy, and agility that we never thought possible. Cost, and the most expensive resource of all, human time, has been a massive driver throughout our Alteryx journey, and I expect these drivers will continue as time passes.

 

Describe the working solution

Our processes vary from workflow to workflow; overall we use a lot of SQL, Oracle, Teradata, and SharePoint. In some workflows we blend 2 sources; in others we blend all of them. It depends on the needs of the business we are working with on any given day. Once the blending is done we do a variety of things with the data: sometimes it goes to apps for self-service consumption, and other times we push it into a data warehouse. One thing that is consistent, though, is final data visualization in Tableau! Today, upwards of 95% of our workflows end up in Tableau, allowing us to empower our users with self-service and analytics reporting. When using databases like SQL and Oracle we see MASSIVE gains from the In-Database tools. The ability for our Alteryx users to leverage such a strong no-code solution creates an advantage for us in the customer service and analytics space, because they already understand the data but now have a means to get to it.

 

Audit Automation:

Audit Automation.PNG

 

Billing:

 

Billing.PNG

 

File Generator:

 

File Generator.PNG

Market Generator:

 

Market Data.PNG

 

Parse:

Parse.PNG

 

Describe the benefits you have achieved

The 7,736 hours mentioned above is cumulative of 7 different processes that we rely on, on a regular basis.

 

  1. One prior process took about 9 days/month to run; we've dropped that to 30 s/month!
  2. Another process required 4 days/quarter, which our team cut to 3 min/quarter.
  3. The third and largest workflow would have taken an estimated 5,200 hours to complete; our team did the same work in 10.4 hours!
  4. The next project was a massive one: we needed a tool to parse XML data into a standardized Excel format. The process it replaced (non-standard PDF to Excel) once took 40 hrs/month; we can now run it in less than 5 s/month!
  5. Less impressive, but still a great deal of time: our systems and QA team contracted us to rebuild their daily reporting for Production Support Metrics. The process took them about 10 hours/month; we got it down to less than 15 sec/day.
  6. One of our internal QA teams asked us to help speed up the prep work for their weekly audit process. We automated a process that took them upwards of 65 hours/month into one that now takes us 10 sec/week!
  7. The last of the 7 processes is one for survey data that took a team 2 hours/week; the same process takes our team about 20 sec/week.
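As a rough sanity check, the seven items above can be annualized in a few lines. The calendar conventions (8-hour workdays, 12 months, 4 quarters, 52 weeks, ~250 workdays) are our own assumptions, not from the write-up, so the result is a ballpark rather than a reconciliation of the reported 7,736 hours:

```python
# Annualize each "before minus after" saving, all in hours.
H = 8  # assumed hours per workday
saved = [
    (9 * H - 30 / 3600) * 12,      # 1: 9 days/mo -> 30 s/mo
    (4 * H - 3 / 60) * 4,          # 2: 4 days/qtr -> 3 min/qtr
    5200 - 10.4,                   # 3: est. 5,200 h -> 10.4 h
    (40 - 5 / 3600) * 12,          # 4: 40 h/mo -> 5 s/mo
    10 * 12 - (15 / 3600) * 250,   # 5: 10 h/mo -> 15 s/day
    65 * 12 - (10 / 3600) * 52,    # 6: 65 h/mo -> 10 s/wk
    (2 - 20 / 3600) * 52,          # 7: 2 h/wk -> 20 s/wk
]
total = sum(saved)
print(round(total))  # lands in the same ballpark as the reported figure
```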

 

We hope you've found our write-up compelling and win-worthy!

 

Author: Alex Huang, Asst. Mgr, Quality Planning & Analysis

Company: Hyundai Motor America

 

Awards Category: Most Time Saved

 

There have been just a few times where some tool or platform has truly "changed" my life.  The two that come immediately to mind are Alteryx & Tableau.  Before I had either, the majority of my time was spent wrangling data, creating reports, and doing what I could using SAS, SQL, & Excel.  I had streamlined as much as I could and still felt bogged down by the rudimentary data tasks that plague many of us. 

 

With the power of Alteryx alone, I've regained 1,253 hours per year.  Alteryx WITH Tableau has saved me an additional 511 hours to a total of 1,764 hours saved per year!  Does that mean I can quit?  Maybe…but I’m not done yet!

 

For those that care for the details, here's a table of time savings I had cataloged during the start of my Alteryx journey.  I’ve had to blank out the activity names for security reasons but the time savings are real.

 

Picture1.png

 

I experienced a 71% savings in time with Alteryx alone!

 

With this new found "free time," I was able to prototype ideas stuck on my To-Do list and create new insight for my business unit.  Now my "what if's" go from idea straight to Alteryx (and to Tableau faster) and I couldn't be happier.  Insights are delivered faster than ever and with more frequent (daily) updates thanks to Alteryx Desktop Automation.

 

Describe the problem you needed to solve

Hyundai Motor America sells thousands of cars per day, so the faster we can identify a quality issue and fix it, the more satisfied our customers will be. Addressing quality concerns earlier and faster helps us avoid additional costs and, most importantly, protects brand loyalty, perceived quality, vehicle dependability, etc. Some examples of actions:

 

  1. Increased the speed at which we validate and investigate problems from survey data resulting in faster campaign launches and remedy development.
  2. Able to digest and understand syndicated data from J.D. Powers within hours instead of weeks allowing us to further validate the effectiveness of our prior quality improvement initiatives and also identify issues we missed.
  3. Being able to blend all the data sources we need (call center, survey data, repair data, etc.) in Alteryx allowed us to more rapidly prototype our customer risk models vs. traditional methods via SAS which took much longer.
  4. Alteryx automation with Tableau allowed us to deploy insight-rich interactive dashboards that enable management to respond to questions in real time during our monthly quality report involving many major stakeholders throughout Hyundai. This led to more productive meetings with more meaningful follow-up action items.

 

I needed to solve a time problem first! I was spending too much time on data prep and reporting, and that just wasn't enough for me. I didn't have enough time to do what I really wanted to do: solve problems!

 

Being an avid fan/user of Tableau, data preparation started becoming my biggest challenge as my dashboard library grew.  I would end up writing monster SQL statements and scripts to get the data ready but I still struggled with automation for creating Tableau Data Extracts (TDE's). I explored using Python to create them but it just wasn't quite the "desired" experience.  Enter Alteryx, life changed.

 

Picture2.png

 

Describe the working solution

My work typically involves blending data from our transactional data warehouse, call center data, survey data, and third-party data from companies like J.D. Powers. Since we have an Oracle database in-house, I'm able to leverage the In-DB tools in Alteryx, which is just amazing! In-DB tools are similar to a "visual query builder," but with the Alteryx look and feel and the added capability of Dynamic Input and Macro Inputs. Since data only moves out of the DB when you want it to, queries are lightning fast, which enables accelerated prototyping!

 

Describe the benefits you have achieved

I've quite literally freed up 93% of my time (given 1960 work hours per year with 15 days of vacation @ 8 hours per day) and started a new "data team" within my business unit with Alteryx & Tableau at its core.  The ultimate goal will be to replicate my time savings for everyone and “free the data” through self-service apps.  At this point, I’ve deployed 5,774 Alteryx nodes using 61 unique tools in 76 workflows of which 24% or so are scheduled and running automatically.  Phew!  Props to the built-in “Batch Macro Module Example” for allowing me to calculate this easily!

 

Picture3.png

 

We are able to identify customer pain points through an automated Alteryx workflow and algorithm that gauges how likely an issue will persist across all owners of the same model/trim package.  We’ve seen how blending Experian ConsumerView data bolsters this model but we’re still in the cost justification phase for that.  Upon detection of said pain point, we are able to trigger alerts and treatments across the wider population to mitigate the impact of this pain point.  Issues that can’t be readily fixed per se are relayed back to R&D for further investigation.  Ultimately customers may never see an issue because we’ve addressed it or they are simply delighted by how fast we’ve responded even when no immediate remedy is available.

 

The true bottom line is that the speed and accuracy at which we execute is critical in our business. Customers want to be heard, and they want to know how we are going to help resolve their problems now, not months later. They want to love their Hyundais, and the more they feel we are helping them achieve that, the more loyal they will be to our brand.

 

Although we can’t fix everything, Alteryx helps us get where we need to be faster, which, in my opinion, is an enabler for success.

Author: Omid Madadi, Developer

Company: Southwest Airlines Co.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve 

Fuel consumption expense is a major challenge for the airline industry. According to the International Air Transport Association, fuel represented 27% of the total operating costs for major airlines in 2015. For this reason, most airlines attempt to improve their operational efficiency in order to stay competitive and increase revenue. One way to improve operational efficiency is to increase the accuracy of fuel consumption forecasting.

 

Currently, Southwest Airlines offers services in 97 destinations with an average of 3,500 flights a day. Not having enough fuel at an airport is extremely costly and may disrupt flights. Conversely, ordering more fuel than an airport needs results in high inventory and storage costs. As such, the objective of this project was to develop proper forecasting models and methods for each of these 97 airports in order to increase the accuracy and speed of fuel consumption forecasting, using historical monthly consumption data.

 

Describe the working solution

Data utilized in this project came from historical Southwest Airlines monthly fuel consumption reports. Datasets were gathered from each of the 97 airports as well as various Southwest departments, such as finance and network planning. Forecasting was performed on four different categories: scheduled flights consumption, non-scheduled flights consumption, alternate fuel, and tankering fuel. Ultimately, the total consumption for each airport was obtained by aggregating these four categories. Since the data were monthly, time series forecasting and statistical models - such as autoregressive integrated moving average (ARIMA), time series linear and non-linear regression, and exponential smoothing - were used to predict future consumption based on previously observed consumption. To select the best forecasting model, an algorithm was developed to compare the accuracy of the various statistical models and select the one that best fits each category and each airport. Ultimately, this model will be used every month by the Southwest Airlines Fuel Department.
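The model-selection step can be sketched as a holdout comparison: fit each candidate on the earlier months, score it on the most recent months, and keep the winner. In the sketch below, three deliberately simplified forecasters stand in for the real ARIMA/regression/exponential-smoothing candidates; the select-by-lowest-error logic is the point:

```python
# Three toy forecasting candidates (stand-ins for ARIMA, regression,
# and exponential smoothing in the real project).
def seasonal_naive(history, horizon, period=12):
    return [history[-period + (i % period)] for i in range(horizon)]

def drift(history, horizon):
    slope = (history[-1] - history[0]) / (len(history) - 1)
    return [history[-1] + slope * (i + 1) for i in range(horizon)]

def ses(history, horizon, alpha=0.3):
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

def best_model(series, holdout=6):
    """Pick the candidate with the lowest MAE on the last `holdout` months."""
    train, test = series[:-holdout], series[-holdout:]
    candidates = {"seasonal_naive": seasonal_naive, "drift": drift, "ses": ses}
    def mae(name):
        forecast = candidates[name](train, holdout)
        return sum(abs(f - y) for f, y in zip(forecast, test)) / holdout
    return min(candidates, key=mae)

# Strongly seasonal monthly consumption: the seasonal model should win.
series = [100 + 20 * (m % 12 == 6) for m in range(36)]
print(best_model(series))
```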

 

Capture3.PNG

 

In addition to developing a consumption forecast that increases fuel efficiency, a web application was also developed. This web application enables the Fuel Department to browse input data files, upload them, and then run the application in an easy, efficient, and effortless manner. Data visualization tools were also added to provide the Fuel Department with better insights of trends and seasonality. Development of the statistical models has been finalized and will be pushed to production for use by the Southwest Airlines Fuel Department soon.

 

Capture4.PNG

 

Capture6.PNG

 

Describe the benefits you have achieved

Initially, the forecasting process for all 97 Southwest Airlines airports used to be conducted through approximately 150 Excel spreadsheets. However, this was an extremely difficult, time-consuming, and disorganized process. Normally, consumption forecasts would take up to three days and would have to be performed manually. Furthermore, accuracy was unsatisfactory since Excel's capabilities are inadequate in terms of statistical and mathematical modeling.

 

For these reasons, a decision was made to use CRAN R and Alteryx for data processing and development of the forecasting models. Alteryx offers many benefits, since it allows executing R-language scripts via the R Tool. Moreover, Alteryx makes data preparation, manipulation, processing, and analysis fast and efficient for large datasets. Multiple data sources and various data types were used in the design workflow, and Alteryx made it convenient to select and filter input data, as well as join data from multiple tables and file types. In addition, the Fuel Department needed a web application that would let multiple users run the consumption forecast without the help of any developers; Alteryx was a simple solution to that need, since the design workflow could be given an interface and published as a web application (through the Southwest Airlines gallery).

 

In general, the benefits of the consumption forecast include (but are not limited to) the following:

 

  • Forecasting accuracy improved approximately 70% for non-scheduled flights and 12% for scheduled flights, which results in considerable fuel cost savings for Southwest Airlines.
  • Execution time dropped dramatically, from 3 days to 10 minutes, and developers are working to reduce it even further.
  • The consumption forecast provides a 12-month forecasting horizon for the Fuel Department; due to the complexity of the process, this could not be done previously with Excel spreadsheets.
  • The Fuel Department can identify seasonality and estimate trends at each airport, providing invaluable insights for decision-makers on fuel consumption at each airport.
  • The consumption forecast identifies and flags outliers and problematic airports, enabling decision-makers to prepare for unexpected conditions.

Author: Qin Lee, Business Analyst

Company: MSXI

 

Awards Category: Most Unexpected Insight

 

Huge data volumes, large files, and multiple applications can now be created, saved, and shared in one small Alteryx file. And now I can test the scripts/code and find the errors. This is a good way to develop a proof of concept for our company.

 

Describe the problem you needed to solve 

We needed to go through many applications to get the data and save it into one location to share and view.

 

Describe the working solution

We are blending data sources from SQL, Access, Excel, and Hadoop; yes, we are leveraging many parties' data. We are developing the workflows and functions for a proof of concept now, and yes, we are exporting to a visualization tool.

 

image002.jpg

 

image004.jpg

 

Describe the benefits you have achieved

We collected the data from many locations and saved it into a small Alteryx database file, created the workflow and functions, developed a search engine, and designed the proof of concept for approval and launch. This saved time, resolved the problem, and increased customer satisfaction. I would like to send my sincere thanks to Mr. Mark Frisch (@MarqueeCrew), who helped us for many days to finish this project.

Author: Shelley Browning, Data Analyst

Company: Intermountain Healthcare

 

Awards Category: Most Time Saved

  

Describe the problem you needed to solve 

Intermountain Healthcare is a not-for-profit health system based in Salt Lake City, Utah, with 22 hospitals, a broad range of clinics and services, about 1,400 employed primary care and secondary care physicians at more than 185 clinics in the Intermountain Medical Group, and health insurance plans from SelectHealth. The entire system has over 30,000 employees. This project was proposed and completed by members of the Enterprise HR Employee Analytics team who provide analytic services to the various entities within the organization.

 

The initial goal was to create a data product utilizing data visualization software. The Workforce Optimization Dashboard and Scorecard is to be used throughout the organization by employees with direct reports. The dashboard provides a view of over 100 human resource metrics on activities related to attracting, engaging, and retaining employees at all levels of the organization. Some of the features in the dashboard include: drilldown to various levels of the organization, key performance indicators (KPIs) to show change, options for various time periods, benchmark comparison with third-party data, and links to additional resources such as detail reports. Prior to completion of this project, the data was available to limited users in at least 14 different reports and dashboards, making it difficult and time-consuming to get a complete view of workforce metrics.

 

During initial design and prototyping it was discovered that in order to meet the design requirements and maintain performance within the final visualization it would be necessary for all the data to be in a single data set. The data for human resources is stored in 17 different tables in an Oracle data warehouse. The benchmark data is provided by a third party. At the time of development the visualization software did not support UNION or UNION ALL in the custom SQL function. During development the iterative process of writing SQL, creating an extract file, and creating and modifying calculations in the visualization was very laborious. Much iteration was necessary to determine the correct format of data for the visualization.

 

Other challenges occurred, such as when it was discovered that the visualization software does not support dynamic field formatting. The data values are reported in formats of percent, currency, decimal and numeric all within the same data column. While the dashboard was in final review it was determined that a summary of the KPI indicators would be another useful visualization on the dashboard. The KPI indicators, red and green arrows, were using table calculations. It is not possible to create additional calculations based on the results of table calculations in the visualization software. The business users also requested another cross tabular view of the same data showing multiple time periods.
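Because the visualization tool could not format a single column that mixes percent, currency, decimal, and numeric values, that kind of formatting has to be resolved per row upstream, in the data preparation. A minimal Python sketch of the idea (the format labels and values are invented):

```python
def format_value(value, fmt):
    """Render a raw number according to its row's format label."""
    if fmt == "percent":
        return f"{value:.1%}"
    if fmt == "currency":
        return f"${value:,.2f}"
    if fmt == "decimal":
        return f"{value:.2f}"
    return f"{value:,.0f}"  # plain numeric

# One metric column, four different display formats.
rows = [(0.123, "percent"), (52500.5, "currency"),
        (3.14159, "decimal"), (30250, "numeric")]
formatted = [format_value(v, f) for v, f in rows]
print(formatted)  # ['12.3%', '$52,500.50', '3.14', '30,250']
```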

 

Describe the working solution

Alteryx was instrumental in the designing and development of the visualization for the workforce dashboard. Without Alteryx the time to complete this project would have easily doubled. By using Alteryx, a single analyst was able to iterate through design and development of both the data set and the dashboard.

 

1.png

 

The final dashboard includes both tabular and graphic visualizations all displayed from the same data set. The Alteryx workflow uses 19 individual Input Data tools to retrieve data from the 17 tables in Oracle and unions this data into the single data set. Excel spreadsheets are the source for joining the third party benchmark data to the existing data. The extract is output from Alteryx directly to a Tableau Server. By utilizing a single set of data, filtering and rendering in visualization are very performant on 11 million rows of data. (Development included testing data sets of over 100 million rows with acceptable but slower performance. The project was scaled back until such a time as Alteryx Server is available for use.)
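The union-then-join shape of that workflow can be sketched in a few lines of Python/pandas. The two query stand-ins below replace the 19 real Oracle inputs, and the benchmark frame replaces the third-party Excel data; column names are invented:

```python
import pandas as pd

# Stand-ins for the 19 Oracle query results; each returns the same layout.
q1 = pd.DataFrame({"org": ["A", "B"],
                   "metric": ["turnover", "turnover"],
                   "value": [0.12, 0.15]})
q2 = pd.DataFrame({"org": ["A", "B"],
                   "metric": ["time_to_fill", "time_to_fill"],
                   "value": [41.0, 38.0]})

# UNION ALL equivalent: stack all query outputs into one data set.
combined = pd.concat([q1, q2], ignore_index=True)

# Join the third-party benchmark data onto the unioned metrics.
benchmarks = pd.DataFrame({"metric": ["turnover", "time_to_fill"],
                           "benchmark": [0.14, 45.0]})
final = combined.merge(benchmarks, on="metric", how="left")
print(len(final))
```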

 

2.png

 

3.png

 

4.png

 

5.png

 

Describe the benefits you have achieved

The initial reason for using Alteryx was the ability to perform a UNION ALL on the 19 input queries. By caching queries, outputting directly to .tde files, and working iteratively to find the best format for the data (both to meet design requirements and to keep filtering and rendering fast in the visualization), months of development time were saved. The 19 data inputs contain over 7,000 lines of SQL code combined; storing this code in Alteryx improves reproducibility and documentation. During the later stages of the project it was fairly straightforward to use the various tools in Alteryx to transform the data to support the additional request for a cross-tab view, and also to recreate the table calculations to mimic those in the visualization. Without Alteryx it would have taken a significant amount of time to recreate these calculations in SQL and rewrite the initial input queries.

 

Our customers are now able to view their Workforce Optimization metrics in a single location. They can now visualize a scenario in which their premium pay has been increasing the last few pay periods and see that this may be attributed to higher turnover rates with longer times to fill for open positions, all within a single visualization. With just a few clicks our leaders can compare their workforce optimization metrics with other hospitals in our organization or against national benchmarks.  Reporting this combination of metrics had not been attempted prior to this time and would not have been possible at this cost without the use of Alteryx.

 

Costs saving are estimated at $25,000 to-date with additional savings expected in future development and enhancements.

Author: Michael Barone, Data Scientist

Company: Paychex Inc.

 

Awards Category: Most Time Saved

 

We currently have more than two dozen predictive models, pulling data of all shapes and sizes from many different sources.  Total processing time for a round of scoring takes 4 hours.  Before Alteryx, we had a dozen models, and processing took around 96 hours.  That's a 2x increase in our model portfolio, but a 24x decrease in processing time.

 

Describe the problem you needed to solve 

Our Predictive Modeling group, which began in the early-to-mid 2000s, had grown from one person to four people by summer 2012. I was one of those four. Our portfolio had grown from one model to more than a dozen. We were what you might call a self-starting group: while we had the blessing of upper management, we were small and independent, doing all research, development, and analysis ourselves. We started with the typical everyday enterprise software solutions. While those worked well for a few years, by the time we were up to a dozen models we had outgrown them. A typical round of "model scoring," which we did at the beginning of every month, took about two-and-a-half weeks, and ninety-five percent of that was system processing time spent cleansing, blending, and transforming the data from varying sources.

 

Describe the working solution

We blend data from our internal databases - everything from Excel and Access to Oracle, SQL Server, and Netezza. Several models include data from 3rd-party sources such as D&B, and the Experian CAPE file we get with our Alteryx data package.

 

Describe the benefits you have achieved

We recently have taken on projects that require us processing and analyzing billions of records of data.  Thanks to Alteryx and more specifically the Calgary format, most of our time is spent analyzing the data, not pulling, blending, and processing.  This leads to faster delivery time of results, and faster business insight.

Author: Andy Kriebel (@VizWizBI), Head Coach

Company: The Information Lab

 

Awards Category: Best 'Alteryx for Good' Story

 

The Connect2Help 211 team outlined their requirements, including a review of the database structure and what they were looking for as outputs. Note that this was also the week that we introduced the Data School to Alteryx. We knew that the team could use Alteryx to prepare, cleanse, and analyse the data. Ultimately, the team wanted to create a workflow in Alteryx that Connect2Help 211 could use in the future.

 

Ann Hartman, Director of Connect2Help 211 summarized the impact best: "We were absolutely blown away by your presentation today. This is proof that a small group of dedicated people working together can change an entire community. With the Alteryx workflow and Tableau workbooks you created, we can show the community what is needed where, and how people can help in their communities."

 

The entire details of the project can be best found here - http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the problem you needed to solve 

In July 2015, Connect2Help 211, an Indianapolis-based non-profit service that facilitates connections between people who need human services and those who provide them, reached out to the Tableau Zen Masters as part of a broader effort that the Zens participate in for the Tableau Foundation. Their goals and needs were simple: create an ETL process that extracts Refer data, transforms it, and loads it into a MySQL database that can be connected to Tableau.

 

Describe the working solution

Alteryx-Workflow-211.png

 

See the workflow and further details in the blog post - http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the benefits you have achieved

While the workflow looks amazingly complex, it absolutely accomplished the goal of creating a reusable ETL workflow. Ben Moss kicked off the project presentations by taking the Connect2Help 211 team through what the team had to do and how Connect2Help 211 could use this workflow going forward.

 

From there, the team went through the eight different visualisations they created in Tableau. Keep in mind, Connect2Help 211 wasn't expecting any visualisations as part of the output, so to say they were excited by what the team created in just a week is a massive understatement.

 

Anuka.png

Author: Michael Peterman, CEO

Company: VeraData

 

Awards Category: Best 'Alteryx for Good' Story

 

We provide deep analytics services for hundreds of clients.  Of particular interest is the National Children's Cancer Society (NCCS).  This prestigious and highly respected organization has been doing more for the families of children with cancer since 1987 - yep, for almost 30 years.  We are honored to serve them as a client.

 

Describe the problem you needed to solve 

NCCS, like every other large charity in America, sends out direct mail fundraising solicitations to support these families.  Just as any business has to spend money to acquire new customers, non-profit organizations spend money to acquire donors.  They were faced with a year-over-year trend of increasing donor acquisition costs and increasing costs to reactivate lapsed donors.  This was coupled with a concern that there was a shrinking universe of potential donors willing to support their efforts.

 

Describe the working solution

Enter VeraData. Our initial engagement with NCCS was to build a donor acquisition model to reduce their cost to acquire donors, which in turn reduces the cycle time to break even on the investment in new donors. Concurrently, we developed a lapsed-reactivation model that used extensive external information to select, from their audience of former donors, the individuals most likely to donate again, thereby increasing the universe of marketable names while maintaining the cost to reactivate. Lastly, our third component was to uncover an expanded universe of individuals with the propensity to support the NCCS. This meant identifying new data sets and determining which individuals would be profitable to pursue.

 

Several methodologies were deployed to achieve these goals. Our analytic team settled on a series of support vector machine models solving for response rate, donation amount, package and channel preferences, and more. All of the information in our arsenal was called upon to contribute to the final suite of algorithms used to identify the best audience. Using Alteryx, R, Tableau and our internal machine learning infrastructure, we were able to combine decades' worth of client-side data with decades' worth of external data and output a blended master analytic database that accounted for full promotional and transactional history with all corresponding composite data on the individuals. This symphony achieved all of the goals, and then some.

 

Macro.png

 

workflow (2).png

Describe the benefits you have achieved

The client experienced a 24% reduction in their cost to acquire a donor, they were able to reactivate a much larger than anticipated volume of lapsed donors (some were inactive for over 15 years) and they discovered an entirely new set of list sources that are delivering a cost to acquire in line with their budget requirements. Mission accomplished.

 

Since that point, we have broadened the scope of our engagement and are solving for other things such as digital fundraising and mid-level and major donors. It wouldn't have been possible to do this with the same speed and precision had we not been using Alteryx.

Author: Wayne Franklin, Student Experience Evaluation Officer

Company: Charles Darwin University

 

Awards Category: Most Time Saved

 

The time saved mainly affects my workload; this in turn allows me to work on other projects for the department, which helps the overall organisation. Being a smaller organisation, our resources are limited, so any time saved makes a significant difference to our overall output. Using Alteryx has quite often saved days of manual work, and it significantly reduces the risk of errors.

 

Describe the problem you needed to solve 

The issue we faced was how best to amalgamate multiple data tables into three new unique Excel files to be used by a third-party survey tool, Blue eXplorance. While this doesn't sound too difficult, it becomes very time-consuming when there are thousands of rows of data in each data source. Being a smaller educational institution, I am at present the only person who handles the setting up, running and reporting of all the surveys within the university; spending a day or two stuck setting up one survey can have a detrimental effect on other projects.

 

Describe the working solution

The old way of doing things: download each of the data sources, which included the full student and unit information for a given semester. This was followed by a series of pivot tables, copy-pasting, creating new fields, and making things more meaningful (e.g. changing 'M' to 'Male' - not much, but supervisors like it better that way). Once all that was done I would eventually end up with clean unit, student and relationship files set up for the third-party survey software. This doesn't sound like much but is quite time-consuming. I got pretty good at Excel formulas, which helped cut the time down a little, but it still took a day of messing around in Excel to get the final product.

The new way of doing things: click run on the Alteryx app I made, wait a minute, done! The app I created allows me to select the files to upload and where to save the output files at the end.
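The recode step described above (expanding terse codes like 'M' into readable labels before writing the survey-tool inputs) can be sketched as follows. This is a hypothetical illustration, not the actual app's logic; the field names and code mappings are assumptions.

```python
# Sketch of the recode step: replace terse codes with readable labels
# before splitting rows into the survey-tool input files.
# Column names and the label map are illustrative.
GENDER_LABELS = {"M": "Male", "F": "Female", "X": "Unspecified"}

def recode_students(rows):
    """Return student rows with gender codes expanded to labels.

    Unknown codes are passed through unchanged rather than dropped.
    """
    return [
        {**row, "gender": GENDER_LABELS.get(row["gender"], row["gender"])}
        for row in rows
    ]

students = [
    {"id": "s001", "gender": "M", "unit": "HIT140"},
    {"id": "s002", "gender": "F", "unit": "HIT140"},
]
print(recode_students(students)[0]["gender"])  # → Male
```

In Alteryx the equivalent is a Formula or Find Replace tool, which is what lets the whole day of Excel work collapse into a single app run.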

 

Describe the benefits you have achieved

What started as a solid day or two work is now reduced to a minute wait time as the Alteryx app is running. This frees me up to continue work on other projects I am working on and be a more productive member of our team.

Author: Mark Frisch (@MarqueeCrew), CEO

Company: MarqueeCrew

 

Awards Category: Name Your Own - Macros for the Good of All Alteryx Users

 

Describe the problem you needed to solve 

Creating samples goes beyond random draws and nth selections.  It is crucial that samples be representative of their source populations if you are going to draw any meaningful truth from your marketing or other use cases.  After creating a sample set, how would you verify that you didn't select too many of one segment versus another?  If you're using Mosaic® data and there are 71 types to consider, did you get enough of each type?

 

image004.png

 

Describe the working solution

Using a chi-squared test, we created a macro and published it to the Alteryx Macro District as well as to the CReW macros (www.chaosreignswithin).  There are two input anchors (Population and Sample), and the configuration requires that you select a categorical variable from both inputs (with the same variable content).  The output is a report that tells you whether your sample is representative or not (including the degrees of freedom and the chi-squared result against a 95% confidence level).
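The macro's internals aren't shown here, but the test it applies can be sketched in a few lines: a chi-squared goodness-of-fit statistic comparing the sample's category counts against the proportions in the population, judged against the 5% critical value for the relevant degrees of freedom. The category names and counts below are made up for illustration.

```python
# Sketch of a chi-squared goodness-of-fit check for sample representativeness.
# Upper-tail critical values of the chi-squared distribution at alpha = 0.05,
# keyed by degrees of freedom (standard statistical-table values).
CRITICAL_05 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488, 5: 11.070}

def chi_squared_representative(pop_counts, sample_counts):
    """Return (statistic, df, is_representative) for one categorical variable.

    pop_counts and sample_counts map each category to its observed count.
    """
    n_sample = sum(sample_counts.values())
    n_pop = sum(pop_counts.values())
    stat = 0.0
    for cat, pop in pop_counts.items():
        expected = n_sample * pop / n_pop        # expected count under H0
        observed = sample_counts.get(cat, 0)
        stat += (observed - expected) ** 2 / expected
    df = len(pop_counts) - 1
    # Fail to reject H0 (sample matches population) if stat <= critical value.
    return stat, df, stat <= CRITICAL_05[df]

pop = {"A": 500, "B": 300, "C": 200}
samp = {"A": 52, "B": 29, "C": 19}
stat, df, ok = chi_squared_representative(pop, samp)
print(df, ok)  # → 2 True
```

With 71 Mosaic types the test runs the same way, just with df = 70 and the corresponding critical value; the macro wraps this logic and formats the verdict as an Alteryx report.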

 

image005.jpg

 

Describe the benefits you have achieved

My client was able to avoid the costly mistake that had plagued their prior marketing initiative and was set up for success.  I wanted to share this feature with the community.  It would be awesome if it ended up helping my charity, the American Cancer Society.  Although this isn't quite as sexy as my competition, it is sexy in its simplicity and geek factor.

 

image006.jpg

Author: Aaron Harter (@aaronharter), Media Ops Manager

Company: Quigley-Simpson

 

Awards Category: Best Use of Alteryx Server

 

We leverage our Alteryx Server to design and implement custom apps that allow any team member at the Agency to benefit from the power of Alteryx without the programming knowledge necessary to construct a solution on their own.  Analytic apps allow all employees at Q-S to leverage the capabilities of Alteryx in a fun, easy-to-use interface.

 

1- QS Gallery Collections.jpg

 

Describe the problem you needed to solve 

Any company can own, buy or hold data. Finding creative applications to use data to drive informed decision making and find opportunities in a market is what separates the wheat from the chaff, regardless of industry.

 

Quigley-Simpson is an advertising agency in the highly fragmented media industry; its unique problems include managing rapidly changing marketplaces with dozens of disparate data sets and supporting many teams with varying reporting needs. The Media Operations team has been tasked with implementing custom solutions to improve efficiency and make sense of the big data coming into the agency.

 

Media measurement is highly reliant on quality data sourcing, blending and modeling, and we have been able to use Alteryx as a centralized environment for handling and processing all of this data across many formats. We have worked closely with key stakeholders in each department to automate away all of their "pain points" relating to data and reporting and interacting with our media buying system.

 

Describe the working solution

Some of our apps join our media buy and audience delivery history with our clients' first-party data and the related third-party audience measurement data from Nielsen. Other third-party data sources we leverage include digital and social media metrics, GfK MRI demographic and psychographic market research, TiVo TRA set-top box data combined with shopper loyalty data, MediaTools authorizations and strategic planning at the brand level, AdTricity digital feedback on pre-, mid-, and post-roll online video campaigns, and comScore digital metrics for website activity.

 

2 - QS App Design.JPG

 

Expediting the processing, summarizing, cross-tabbing and formatting of these data sets has added an element of standardization to our reporting that did not exist previously, while improving speed and accuracy. An app we built for one of our teams produces over 50 reports, ready for distribution, in less than three minutes, replacing a process that used to take a full day to accomplish.
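The fan-out pattern behind an app like that (one blended data set grouped by some key, with a formatted summary emitted per group) can be sketched briefly. The field names, markets and metrics here are invented for illustration; the real app does this with Alteryx summarize and reporting tools.

```python
# Sketch of fan-out reporting: group one blended dataset by market
# and build a distribution-ready summary line per group.
from collections import defaultdict

def build_reports(rows):
    """Return {market: summary_text} from blended delivery rows."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["market"]].append(row["impressions"])
    return {
        market: f"{market}: {len(vals)} spots, {sum(vals):,} impressions"
        for market, vals in groups.items()
    }

rows = [
    {"market": "Indianapolis", "impressions": 120_000},
    {"market": "Indianapolis", "impressions": 80_000},
    {"market": "Denver", "impressions": 95_000},
]
reports = build_reports(rows)
print(reports["Indianapolis"])  # → Indianapolis: 2 spots, 200,000 impressions
```

Writing each summary out as its own formatted file is then a loop over the dictionary, which is why 50-plus reports can be produced in one run.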

 

3 - QS Top 20 Data Blending Workflow.JPG

 

Additionally, we are using spatial tools to analyze the delivery and performance of a pilot Programmatic Television test, which aggregates local-market TV inventory to represent a national footprint. Several of our workflows blend and prep data for visualization on our in-house "Data Intelligence Platform," which is powered by Tableau. This is then used by our media planners and buyers to optimize campaigns to meet goals and exceed client expectations.

 

The flexibility to build out apps or dashboards, depending on the needs of the end user, has been phenomenal and very well received at the Agency.

 

4 - QS Automaded Reporting Model.JPG

 

Describe the benefits you have achieved

Now that we are an Alteryx organization, we are replacing all of our outdated processes and procedures with gracefully simple workflows that are propelling the Agency to the forefront of technology and automation. Our report generating apps have improved the accuracy, reliability, and transparency of our reporting. The log processing apps have saved thousands of hours of manual data entry. Now that our workforce has been liberated from these time consuming, monotonous tasks, we are wholly focused on growing our clients' business while better understanding marketplace conditions.

 

Streamlining the workflow processes has allowed for drastically reduced on-boarding times while maintaining data integrity and improving accuracy. It has been a primary goal to give all employees the tools to increase their knowledge base and grow their careers by improving the access to data they use for daily decision making, a goal we are achieving thanks in large part to our Alteryx Server.

 

2016 Alteryx Server app totals (as of 4/22/16):

  • Teams using apps = 7
  • Number of apps = 44
  • 2016 app run count = 1,794
  • 2016 time savings = 4,227 hours