
Past Analytics Excellence Awards


Author: Andy Kriebel (@VizWizBI), Head Coach

Company: The Information Lab

 

Awards Category: Best 'Alteryx for Good' Story

 

The Connect2Help 211 team outlined their requirements, including a review of the database structure and the outputs they were looking for. Note that this was also the week that we introduced the Data School to Alteryx. We knew the team could use Alteryx to prepare, cleanse, and analyse the data. Ultimately, the team wanted to create a workflow in Alteryx that Connect2Help 211 could use in the future.

 

Ann Hartman, Director of Connect2Help 211, summarized the impact best: "We were absolutely blown away by your presentation today. This is proof that a small group of dedicated people working together can change an entire community. With the Alteryx workflow and Tableau workbooks you created, we can show the community what is needed where, and how people can help in their communities."

 

Full details of the project can be found here - http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the problem you needed to solve 

In July 2015, Connect2Help 211, an Indianapolis-based non-profit service that connects people who need human services with those who provide them, reached out to the Tableau Zen Masters as part of a broader effort that the Zens participate in for the Tableau Foundation. Their goals and needs were simple: create an ETL process that extracts Refer data, transforms it, and loads it into a MySQL database that can be connected to Tableau.
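The requested pattern (extract the Refer data, transform it, load it into a database Tableau can read) can be sketched in a few lines outside Alteryx as well. The sketch below is purely illustrative: the referral data is fabricated, and SQLite stands in for the MySQL target so the example runs self-contained.

```python
import sqlite3

import pandas as pd

# Hypothetical extract step: in the real project this was Refer export
# data; here a tiny fabricated frame keeps the sketch self-contained.
referrals = pd.DataFrame({
    "agency": ["Agency A", "Agency A", "Agency B"],
    "need":   ["Housing", "Food", "Housing"],
    "date":   ["2015-07-01", "2015-07-02", "2015-07-03"],
})

# Transform: cleanse types and aggregate referrals per agency and need.
referrals["date"] = pd.to_datetime(referrals["date"])
summary = (referrals.groupby(["agency", "need"], as_index=False)
                    .size().rename(columns={"size": "referral_count"}))

# Load: write to a relational table Tableau could connect to.
# (SQLite stands in for the MySQL target described in the post.)
conn = sqlite3.connect(":memory:")
summary.to_sql("referral_summary", conn, index=False)

loaded = pd.read_sql("SELECT * FROM referral_summary", conn)
print(loaded)
```

In the real workflow each of these three stages would map onto groups of Alteryx tools rather than pandas calls.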

 

Describe the working solution

Alteryx-Workflow-211.png

 

See the workflow and further details in the blog post - http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the benefits you have achieved

While the workflow looks amazingly complex, it absolutely accomplished the goal of creating a reusable ETL workflow. Ben Moss kicked off the project presentations by taking the Connect2Help 211 team through what the team had to do and how Connect2Help 211 could use this workflow going forward.

 

From there, the team went through the eight different visualisations they created in Tableau. Keep in mind, Connect2Help 211 wasn't expecting any visualisations as part of the output, so to say they were excited by what the team created in just a week is a massive understatement.

 

Anuka.png


Author: Rana Dalbah, Director - Workforce Intelligence & Processes

Company: BAE Systems

 

Awards Category: Most Unexpected Insight - Best Use Case for Alteryx in Human Resources

 

People do not expect those of us working in Human Resources to be technology savvy, let alone to become technology leaders and host a "Technology Day" to show HR and other functions the technology we are leveraging and how it has allowed us, as a team, to become more efficient and scalable.

 

Within the Workforce Intelligence team, the team responsible for HR metrics and analytics, we have been able to leverage Alteryx in a way that allows us to become more scalable rather than "live in the data", spending the majority of our time formatting, cleansing, and re-indexing. For example, Alteryx replaced both Microsoft T-SQL and R code for our HR Dashboard, decreasing its pre-processing time from 8-16 hours per month to less than 10 minutes per month, and that figure does not even account for the elimination of human intervention and error.

 

The time savings from Alteryx have allowed us to create custom metrics in the dashboard at a faster rate to meet customer demands. It has also given us the opportunity to pursue other aspects of Alteryx: forecast modeling, statistical analysis, predictive analytics, etc. The fact that we can turn the HR Dashboard around in two days instead of a week has been a game changer.

 

The HR dashboard is considered to have relevant data that is constantly used for our Quarterly Business Reviews and has attracted the attention of our CEO and Senior Leadership. Another use we have found for Alteryx is a workflow for our Affirmative Action data processing. Our Affirmative Action process has lacked consistency over the years and has changed hands countless times, with no one person owning it for more than a year. After seeing the capabilities demonstrated by our HR Dashboard, we decided to leverage Alteryx to create a workflow for our Affirmative Action processing that took 40 hours of work down to 7 minutes, plus an additional hour for source data recognition and correction.  We have not only cut a two- or three-month process down to a few minutes, but we also now have a documented workflow that lists all the rules and exceptions for our process and only needs slight tweaks as requirements change.

 

For our first foray into predictive analytics, we ran a flight risk model on a certain critical population.  Before Alteryx, the team used SPSS and R for the statistical analysis and created a Microsoft Access database to combine and process at least 30 data files; running that process, with predictive measures, took about six months.  After the purchase of Alteryx, the workflow was recreated and refined in Alteryx, and we were able to run a smaller flight risk analysis on another subset of our population in about a month, with better visualizations than R had to offer.  By reducing the data wrangling time, we can create models in a more timely fashion, while the results are still relevant.
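For readers unfamiliar with the modeling side, a flight risk model of this kind is typically a binary classifier over employee attributes. The sketch below is a hedged illustration using scikit-learn's logistic regression: the data is synthetic and the three features (standing in for things like tenure, commute time, and engagement score) are invented, not the team's actual inputs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Invented features standing in for real inputs such as tenure,
# commute time, and survey engagement score.
X = rng.normal(size=(n, 3))
# Synthetic label: feature 1 raises attrition odds, feature 2 lowers them.
logits = 0.8 * X[:, 1] - 1.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]  # per-employee flight-risk score
print(f"mean predicted risk: {risk.mean():.2f}")
```

The output is a probability per employee, which is what makes the results easy to rank and visualize downstream.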

 

The biggest benefit of these time-savings is that it has freed up our analytics personnel to focus less on “data chores” and more on developing deeper analytics and making analytics more relevant to our executive leadership and our organization as a whole.  We’ve already become more proactive and more strategic now that we aren’t focusing our time on the data prep.  The combination of Alteryx with Tableau is transformative for our HR, Compensation, EEO-1, and Affirmative Action analytics.  Now that we are no longer spending countless hours prepping data, we’re assisting other areas, including Benefits, Ethics, Safety and Health, Facilities, and even our Production Engineering teams with ad-hoc analytics processing.

 

Describe the problem you needed to solve 

A few years ago, HR metrics were a somewhat foreign concept for our Senior Leadership; we could barely get consensus on the definitions of headcount and attrition.  But for HR to bring to the table what Finance and Business Development do (metrics, data, measurements, etc.), we needed a way to start displaying relevant HR metrics that could steer Senior Leadership in the right direction when making decisions for the workforce.  The HR Dashboard we launched in January 2014 was simple and met only minimum requirements, but it was a start. We used Adobe, Apache code, and SharePoint, along with data in Excel files, to create simple metrics and visuals. In April 2015, we relaunched the HR Dashboard in Tableau with the help of a third party that used Microsoft SQL Server to read the data and visualize it to our requirements. However, this was not the best solution for us, because we could not make dynamic changes to the dashboard quickly. The dashboard was being released about two weeks after fiscal month end, which is an eternity in terms of relevance to our Senior Leadership.

 

Once we had the talent in-house, we were able to leverage our technological expertise in Tableau and then, with the introduction of Alteryx, create workflows that cut a two-week process down to a few days - including data validation and dashboard distribution to the HR Business Partners and Senior Leadership.  But why stop there?  We viewed Alteryx as a way to refine existing manual processes (marrying multiple Excel files using vlookups, pivot tables, etc. that were not necessarily documented by the users) and cut back on processing time. If we can build it once and spend minimal time maintaining the workflow, why not build it?  This way, all one has to do in the future is append or replace a file and hit the start button, and the output is created.  Easy peasy! That is when we decided we could leverage this tool for our compliance team and build out the Affirmative Action process, described above, along with the EEO-1 and Vets processing.

 

What took months and multiple resources now takes minutes and only one resource.

 

Describe the working solution

The majority of the data we use comes from our HCM (Human Capital Management) database in Excel-based files. In addition to the HCM files, we also use files from our applicant tracking system (ATS), IMPACT Awards data, our benefits provider, 401K and pension providers, and Safety and Health data.

 

Anything that does not come out of our HCM system comes from a third-party vendor. These files are used specifically for our HR dashboard, Affirmative Action Plan workflow, Safety & Health dashboard, and our benefits dashboard.

 

In addition to dashboards, we have been able to leverage the files mentioned above, along with survey data and macro-economic factors, for our flight risk model. We have also leveraged Google Maps data to calculate the commute time from an employee's home zip code to their work location zip code; this was a more accurate measure of time spent on the road to and from work than distance alone.
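A commute-time lookup of this kind can be done against Google's Distance Matrix API. The sketch below is an illustration only, not the team's actual method: the parsing helper is exercised against a canned response in the API's documented shape, so no API key or network call is needed, while `commute_minutes` shows how a live query might look.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://maps.googleapis.com/maps/api/distancematrix/json"

def commute_minutes_from_response(payload: dict) -> float:
    """Pull the drive time (in minutes) out of a Distance Matrix response."""
    element = payload["rows"][0]["elements"][0]
    if element["status"] != "OK":
        raise ValueError(f"no route: {element['status']}")
    return element["duration"]["value"] / 60  # the API reports seconds

def commute_minutes(home_zip: str, work_zip: str, api_key: str) -> float:
    """Query the Distance Matrix API for ZIP-to-ZIP drive time (live call)."""
    query = urlencode({"origins": home_zip, "destinations": work_zip,
                       "key": api_key})
    with urlopen(f"{API}?{query}") as resp:
        return commute_minutes_from_response(json.load(resp))

# Canned response in the API's documented shape, so the parsing logic
# can be checked without a network call or API key.
sample = {"rows": [{"elements": [{"status": "OK",
                                  "duration": {"value": 1800}}]}]}
print(commute_minutes_from_response(sample))  # 30.0
```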

 

The ultimate outputs vary. An HR dashboard that tracks metrics such as demographics, headcount, attrition, employee churn/movement, rewards, and exit surveys is published as a Tableau workbook, as are a flight risk analysis that lets us determine which factors most contribute to certain populations leaving the company, and a compensation dashboard that gives executives a quick way to do merit and Incentive Compensation planning, including base pay, pay ratios, etc.

 

This workflow takes as its input our employee roster file, which includes each employee's work location plus supervisor identifiers and work locations up to their fourth-level supervisor.  For the first step of processing, we used stacked joins to establish each employee's supervisor hierarchy up to the 8th level.  We then needed to assign an initial "starting location" for each employee based on the location type.  That meant "rolling up" the employee's location until we hit an actual company site, not a client site, because Affirmative Action reporting requires using actual company sites.  The roll-up was accomplished using nested filters, which are easier to see, understand, modify, and track than a large ELSEIF function (important for team sharing).
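The stacked-join idea, building the hierarchy one supervisor level at a time with repeated joins, can be mimicked in pandas with repeated self-merges. A toy sketch with invented employee IDs and only three levels rather than eight:

```python
import pandas as pd

# Toy roster: each employee record knows only its direct supervisor.
roster = pd.DataFrame({
    "emp_id":        ["E1", "E2", "E3", "E4"],
    "supervisor_id": ["E2", "E3", "E4", None],
})

# Repeated self-joins "stack" the hierarchy one level at a time,
# mirroring the stacked-join approach described above (8 levels in
# the real workflow; 3 suffice for this toy roster).
hierarchy = roster.rename(columns={"supervisor_id": "level_1"})
for level in range(1, 3):
    hierarchy = hierarchy.merge(
        roster.rename(columns={"emp_id": f"level_{level}",
                               "supervisor_id": f"level_{level + 1}"}),
        on=f"level_{level}", how="left")

print(hierarchy)
```

Each pass adds one `level_n` column; employees near the top of the chain simply pick up missing values, just as a left join in Alteryx would leave nulls.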

 

Once the initial location rollup was completed, we then needed to roll up employees until every employee was at a site with at least 40 employees.  While simply rolling all employees up at once would be quick, it would also result in fewer locations, with many employees rolled up too far from their current site, which would undermine the validity and effectiveness of our Affirmative Action plan.  Instead, we used a slow-rolling aggregate sort technique, where lone employees are rolled up into groups of two, then groups of two are rolled up into larger groups, and so on, until sites are determined with a minimum of 40 employees (or whatever number is input).  The goal is to aggregate employees effectively while minimizing the "distance" of each employee from their initial site.  This sorting was accomplished using custom-built macros with a group-size control input that can be quickly changed by anyone using the workflow.
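A minimal sketch of the slow-rolling idea in plain Python, assuming each site's headcount and a parent-site mapping are known (both invented here): repeatedly merge the smallest under-threshold site into its parent, so each step moves as few employees as possible.

```python
def roll_up(site_counts: dict, parent: dict, min_size: int = 40) -> dict:
    """Merge under-threshold sites into their parents, smallest first."""
    counts = dict(site_counts)
    while True:
        small = [s for s, n in counts.items()
                 if n < min_size and s in parent]
        if not small:
            return counts
        # Roll up the *smallest* offender first, so employees move as
        # short a "distance" from their initial site as possible.
        site = min(small, key=counts.get)
        counts[parent[site]] = counts.get(parent[site], 0) + counts.pop(site)

# Invented data: two client sites roll up into one company site.
counts = roll_up({"HQ": 60, "SiteA": 25, "SiteB": 30},
                 {"SiteA": "HQ", "SiteB": "HQ"})
print(counts)  # {'HQ': 115}
```

The `min_size` parameter plays the role of the group-size control input on the real macros.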

 

The end result was the roster of employees with the original data, with new fields identifying their roll-up location, and what level of roll-up from their initial location was needed.  A small offshoot of “error” population (usually due to missing or incorrect data) is put into a separate file for later iterative correction.

 

Previously, this process was done through trial and error in Access and Excel.  That process was not only much slower and more painstaking, but it also tended to result in larger "distances" of employees from initial sites than was necessary.  As a result, our new process is quicker, less error-prone, and arguably more defensible than its predecessor.

 

image001.png

 

One of the Macros used in AAP:

 

image002.png

 

Describe the benefits you have achieved

Alteryx has enabled our relatively small analytics shop (3 people) to build a powerful, flexible, and scalable analytics infrastructure without working through our IT support.  We are independent and can thus respond to users' custom requests in a timely fashion.  We are seen as agile and responsive - creating forecasting workflows in a few days to preview for our CEO and CHRO instead of building PowerPoint slides to pitch them a concept.  This way, we can show them what we expect it to look like and how it will work, and we can act on any feedback they give us while continuing to meet their requirements.  The possibilities of Alteryx, in our eyes, are endless, and for a minimal investment we are constantly "wowing" our customers with the service and products we provide.  In the end, we have been successful in showing that HR can leverage the latest technologies to become more responsive to business needs without IT or developer involvement.


Author: Alexandra Wiegel, Tax Business Intelligence Analyst
Company: Comcast Corp


Awards Category: Best Business ROI

 

A Corporate Tax Department is not typically associated with a Business Intelligence team sleekly manipulating and mining large data sources for insights.  Alteryx has allowed our Tax Business Intelligence team to provide incredibly useful insight to several branches of our larger Tax Department. Today, almost all of our data is in Excel or CSV format, so data organization, manipulation, and analysis had previously been confined to Excel, with the occasional Tableau visualization. Alteryx has given us the ability to analyze, organize, and manipulate very large amounts of data from multiple sources.  Alteryx is exactly what we need to solve our colleagues' problems.


Describe the problem you needed to solve

Several weeks ago, we were approached about using Alteryx for a discovery project that would give our colleagues further insight into how tax codes are applied to customer bills. Currently, our Sales Tax team uses two different methods to apply taxes to our two main products. The first method is to apply tax codes to customer bill records and then run those codes through software that generates and applies taxes to each record. The second method is more home-grown and appears to be leading to less consistent taxability on that side of our business.

 

Given that we sell services across the entire country, we wanted to explore standardization across all our markets. So, our Sales Tax team tasked us with creating a workflow that would compare the two methods and support a plan toward standardization, showing the effect it would have on every customer's bill.

 

Describe the working solution

Our original source file was a customer level report where the records were each item (products, fees, taxes, etc.) on a customer’s bill for every customer in a given location. As it goes with data projects, our first task was to cleanse, organize, and append the data to make it uniform.

 

21.PNG

 

The next step was to add in data from several other sources that we would ultimately need in order to show the different buckets of customers according to the monetary changes to their bills. Since these sources were all formatted differently, there was often no unique identifier we could use to join a new data source to our original report. Hence, we had to create a method to ensure we did not create duplicate records when using the join function. We ended up using this process multiple times (pictured below).
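The hazard being guarded against here is the classic join fan-out: if the lookup source repeats the join key, every matching bill record gets duplicated. One way to picture the safeguard (a pandas sketch with invented data, not the team's actual Alteryx method) is to de-duplicate the lookup on the key and assert the many-to-one relationship:

```python
import pandas as pd

# Invented bill records and a lookup source with a repeated key.
bills = pd.DataFrame({"account": ["A1", "A1", "A2"],
                      "item":    ["svc", "fee", "svc"]})
lookup = pd.DataFrame({"account": ["A1", "A1", "A2"],
                       "region":  ["East", "East", "West"]})

# De-duplicate the lookup on the join key first, so each bill record
# matches at most one row and the record count is preserved.
deduped = lookup.drop_duplicates(subset="account")
joined = bills.merge(deduped, on="account", how="left",
                     validate="many_to_one")

assert len(joined) == len(bills)  # no fan-out duplicates
print(joined)
```

The `validate="many_to_one"` argument makes pandas raise an error if the right side ever stops being unique, which is a cheap way to keep the guarantee honest as sources change.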

 

22.PNG

 

And so, the workflow followed. We added tax descriptions, new codes, and other information. We added calculated fields to determine the amount of tax that should be owed by each customer today, based on our current coding methods.

 

23.PNG

24.PNG

25.PNG

26.PNG


 

After we had layered in all the extra data that we would need to create our buckets, we distinguished between the two lines of business and added in the logic to determine which codes, at present, are taxable.

 

28.PNG

 

For the side of our business whose taxability is determined by software, you will notice that the logic is relatively simple. We added in our tax codes using the same joining method as above and then used a single join to a table that lists the taxable codes.

 

29.PNG

 

For the side of our business whose taxability is determined by our home-grown method, you can see below that the logic is more complicated. Currently, the tax codes for this line of business are listed in a way that requires us to parse a field and stack the resulting records in order to isolate individual codes. Once we have done this, we can apply the taxability portion. We then have to use this as a lookup against the actual record to determine whether the code column contains a tax code that has been marked as taxable. In other words, applying our home-grown taxability logic is complicated, time consuming, and leaves much room for error.
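The parse-and-stack step described above maps naturally onto a split-and-explode pattern: break the packed field into one code per row, flag each code, then roll the flags back up to the record. A small pandas sketch with invented codes:

```python
import pandas as pd

# Invented records, each listing several tax codes packed into one field.
records = pd.DataFrame({"record_id": [1, 2],
                        "codes": ["T100;T200", "T300"]})
taxable_codes = {"T200"}  # hypothetical taxable-code list

# Parse the packed field and stack one code per row...
stacked = (records.assign(code=records["codes"].str.split(";"))
                  .explode("code"))
# ...flag taxability per code, then roll back up to the record level:
stacked["taxable"] = stacked["code"].isin(taxable_codes)
flags = stacked.groupby("record_id")["taxable"].any()
print(flags.to_dict())  # {1: True, 2: False}
```

A record is taxable if any one of its stacked codes is, which is why the roll-up uses `any` rather than `all`.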

 

210.PNG

 

Once we stacked all this data back together, we joined it with the new tax code table. This gives us the new codes so that the software can be used for both lines of business. Once we know the new codes, we can simulate the software's process and determine which of the new codes will be taxable.

 

211.PNG

 

Knowing whether or not codes are taxable helps us hypothesize about how problematic a geographic location may end up being for our team, but it does not tell us the dollar amount of the taxes that will change. To know this, we must output files that will be run through the real software.

 

Hence, once we have completed the above data manipulation, cleansing, and organization, we extract the data that we want to have run through the software and reformat the records to match the necessary format for the software recognition.

 

212.PNG

213.PNG

 

We created the above two macros to reformat the columns in order to simply this extensive workflow. Pictured below is the top macro. The difference between the two resides in the first select tool where we have specified different fields to be output.

 

214.PNG

 

After the reformatting, we output the files and send them to the software team.

 

215.PNG

216.PNG

 

When the data is returned to us, we will be able to determine the current amount of tax that is being charged to each customer as well the amount that will be charged once the codes are remapped. The difference between these two will then become our buckets of customers and our Vice President can begin to understand how the code changes will affect our customer’s bills.

 

Describe the benefits you have achieved

Although this project took several weeks to build in Alteryx, it was well worth the time invested as we will be able to utilize it for any other locations. We have gained incredible efficiency in acquiring insight on this standardization project using Alteryx. Another benefit we have seen in Alteryx is the flexibility to make minor changes to our workflow which has helped us easily customize for different locations. All of the various Alteryx tools have made it possible for the Tax Business Intelligence team to assist the Tax Department in accomplishing large data discovery projects such as this.

 

Further, we have begun creating an Alteryx app that can be run by anyone in our Tax Department. This frees up the Tax Business Intelligence team to work on other important projects that are high priority.

A common benefit theme amongst Alteryx users is that Alteryx workflows save companies large amounts of time in data manipulation and organization. Moreover, Alteryx has made it possible (where it is impossible in Excel) to handle large and complicated amounts of data and in a very user friendly environment. Alteryx will continue to be a very valuable tool which the Tax Business Intelligence team will use to help transform the Tax department into a more efficient, more powerful, and more unified organization in the coming years.

 

How much time has your organization saved by using Alteryx workflows?

We could never have done this data discovery project without using Alteryx.  It was impossible to create any process within Excel given the quantity and complexity of the data.

 

In other projects, we are able to replicate Excel reconciliation processes that are run annually, quarterly, and monthly in Alteryx.  The Alteryx workflows have saved our Tax Department weeks of manual Excel pivot table work.  Time savings on individual projects can range from a few hours to several weeks.

 

What has this time savings allowed you to do?

The time savings has been invaluable.  The Tax Department staff are now able to free themselves of the repetitive tasks in Excel, obtain more accurate results and spend time doing analysis and understanding the results of the data.  The “smarter” time spent to do analyses will help transform the Tax Department with greater opportunities to further add value to the company.

Author: Alexandra Wiegel, Tax Business Intelligence Analyst
Company: Comcast Corp


Awards Category: Best Business ROI

 

A Corporate Tax Department is not typically associated with a Business Intelligence team sleekly manipulating and mining large data sources for insights. Alteryx has allowed our Tax Business Intelligence team to provide incredibly useful insight to several branches of our larger Tax Department. Today, almost all of our data is in Excel or CSV format, so data organization, manipulation, and analysis had previously been accomplished within the confines of Excel, with occasional Tableau for visualization. Alteryx has given us the ability to analyze, organize, and manipulate very large amounts of data from multiple sources. Alteryx is exactly what we need to solve our colleagues' problems.


Describe the problem you needed to solve

Several weeks ago we were approached about using Alteryx for a discovery project that would give our colleagues further insight into how tax codes are applied to customer bills. Currently, our Sales Tax team uses two different methods to apply taxes to two of our main products. The first method applies tax codes to customer bill records and then runs those codes through software that generates and applies taxes to each record. The second method is more home-grown and appears to produce less consistent taxability on that side of our business.

 

Given that we sell services across the entire country, we wanted to explore standardization across all our markets. So our Sales Tax team tasked us with creating a workflow that would compare the two methods, support a plan toward standardization, and show the effect it would have on every customer's bill.

 

Describe the working solution

Our original source file was a customer-level report in which each record was an item (product, fee, tax, etc.) on a customer's bill, for every customer in a given location. As with most data projects, our first task was to cleanse, organize, and append the data to make it uniform.

 

21.PNG

 

The next step was to add in the data from several sources that we would ultimately need to show the different buckets of customers according to the monetary changes in their bills. These sources were all formatted differently, and there was often no unique identifier we could use to join them to our original report. Hence, we had to create a method to ensure we did not create duplicate records when using the Join tool. We ended up using this process multiple times (pictured below).

 

22.PNG
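Conceptually, the duplicate-safe join can be sketched in pandas (the data and column names here are illustrative, not the actual workflow fields): collapse the lookup source to one row per join key before joining, so the bill-level record count never inflates.

```python
# Sketch (assumed data): avoid duplicate rows when joining a secondary
# source that has no unique key, by de-duplicating the lookup side first.
import pandas as pd

bills = pd.DataFrame({
    "customer": ["A", "A", "B"],
    "item": ["svc1", "fee1", "svc2"],
})

# Lookup source with repeated rows and no unique identifier.
descriptions = pd.DataFrame({
    "item": ["svc1", "svc1", "svc2", "fee1"],
    "tax_desc": ["video", "video", "data", "fee"],
})

# Collapse the lookup to one row per join key before joining,
# so the left side keeps exactly its original record count.
lookup = descriptions.drop_duplicates(subset="item")
merged = bills.merge(lookup, on="item", how="left")

assert len(merged) == len(bills)  # no duplicate records created
print(merged)
```

The same pattern can be wrapped and reused wherever a new source has to be attached to the report.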

 

And so the workflow grew. We added tax descriptions, new codes, and other information, as well as calculated fields to determine the amount of tax each customer should owe today under our current coding methods.

 

23.PNG

24.PNG

25.PNG

26.PNG


 

After we had layered in all the extra data that we would need to create our buckets, we distinguished between the two lines of business and added the logic to determine which codes are currently taxable.

 

28.PNG

 

For the side of our business whose taxability is determined by software, the logic is relatively simple. We added our tax codes using the same joining method as above, then used a single join to a table that lists the taxable codes.

 

29.PNG

 

For the side of our business whose taxability is determined by our home-grown method, the logic is more complicated, as you can see below. The tax codes for this line of business are stored in a way that requires us to parse a field and stack the resulting records in order to isolate individual codes. Once we have done this, we can apply the taxability flags, then use the result as a lookup against the original records to determine whether a record's code column contains a tax code marked as taxable. In other words, applying our home-grown taxability logic is complicated, time consuming, and leaves much room for error.

 

210.PNG
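The parse-and-stack step can be sketched in pandas (the packed-code layout and the code values are hypothetical): split the multi-code field, stack one code per row, flag taxable codes against a lookup, then roll the flag back up to the record.

```python
# Sketch (hypothetical field layout): the home-grown side stores several
# tax codes packed in one field, e.g. "T01;T07". Parsing means splitting
# that field and stacking one code per row, then flagging taxable codes.
import pandas as pd

records = pd.DataFrame({
    "record_id": [1, 2],
    "codes": ["T01;T07", "T02"],
})
taxable_codes = {"T07"}  # invented lookup of taxable codes

# Split the packed field and stack one code per row.
stacked = (records.assign(code=records["codes"].str.split(";"))
                  .explode("code"))
stacked["taxable"] = stacked["code"].isin(taxable_codes)

# A record is taxable if any of its stacked codes is taxable.
record_taxable = stacked.groupby("record_id")["taxable"].any()
print(record_taxable)
```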

 

Once we stacked all this data back together, we joined it with the new tax code table. This gives us the new codes so that the software can be used for both lines of business. Knowing these new codes, we can simulate the software's process and determine which of the new codes will be taxable.

 

211.PNG

 

Knowing whether or not codes are taxable helps us hypothesize about how problematic a geographic location may be for our team, but it does not tell us the dollar amount of taxes that will change. To know this, we must output files to be run through the real software.

 

Hence, once we have completed the above data manipulation, cleansing, and organization, we extract the data that we want run through the software and reformat the records to match the format the software expects.

 

212.PNG

213.PNG

 

We created the above two macros to reformat the columns in order to simplify this extensive workflow. Pictured below is the top macro. The difference between the two resides in the first Select tool, where we have specified different fields to be output.

 

214.PNG

 

After the reformatting, we output the files and send them to the software team.

 

215.PNG

216.PNG

 

When the data is returned to us, we will be able to determine the current amount of tax being charged to each customer as well as the amount that will be charged once the codes are remapped. The difference between the two will then define our buckets of customers, and our Vice President can begin to understand how the code changes will affect our customers' bills.
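The final bucketing step might look like this in pandas (the amounts and bucket edges are invented for illustration; the real thresholds would be a business decision): compute the per-customer change between current and remapped tax, then cut it into buckets.

```python
# Sketch (assumed amounts): bucket customers by the change between the
# tax charged under current codes and under the remapped codes.
import pandas as pd

df = pd.DataFrame({
    "customer": ["A", "B", "C", "D"],
    "tax_current": [10.0, 25.0, 5.0, 40.0],
    "tax_new": [12.0, 20.0, 5.0, 55.0],
})
df["change"] = df["tax_new"] - df["tax_current"]

# Hypothetical bucket edges for the customer-impact summary.
df["bucket"] = pd.cut(
    df["change"],
    bins=[-float("inf"), -0.01, 0.01, 10, float("inf")],
    labels=["decrease", "no change", "small increase", "large increase"],
)
print(df[["customer", "change", "bucket"]])
```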

 

Describe the benefits you have achieved

Although this project took several weeks to build in Alteryx, it was well worth the time invested, as we will be able to reuse it for other locations. We have gained incredible efficiency in acquiring insight on this standardization project using Alteryx. Another benefit we have seen is the flexibility to make minor changes to our workflow, which has helped us easily customize it for different locations. The breadth of Alteryx tools has made it possible for the Tax Business Intelligence team to assist the Tax Department in accomplishing large data discovery projects such as this.

 

Further, we have begun creating an Alteryx app that can be run by anyone in our Tax Department. This frees up the Tax Business Intelligence team to work on other high-priority projects.

A common benefit theme amongst Alteryx users is that Alteryx workflows save companies large amounts of time in data manipulation and organization. Moreover, Alteryx has made it possible to handle large and complicated data sets that are impossible in Excel, in a very user-friendly environment. Alteryx will continue to be a valuable tool that the Tax Business Intelligence team uses to help transform the Tax Department into a more efficient, more powerful, and more unified organization in the coming years.

 

How much time has your organization saved by using Alteryx workflows?

We could never have done this data discovery project without using Alteryx.  It was impossible to create any process within Excel given the quantity and complexity of the data.

 

In other projects, we are able to replicate Excel reconciliation processes that are run annually, quarterly, and monthly in Alteryx.  The Alteryx workflows have saved our Tax Department weeks of manual Excel pivot table work.  Time savings on individual projects can range from a few hours to several weeks.

 

What has this time savings allowed you to do?

The time savings have been invaluable. The Tax Department staff are now able to free themselves of repetitive tasks in Excel, obtain more accurate results, and spend their time analyzing and understanding the data. This "smarter" time spent on analysis will help transform the Tax Department and create greater opportunities to add value to the company.

Amway logo.jpg

Author: Adam Rant (@Rant) - Business Systems Analyst

Team Members: Tom Madden, Jordan Howell, Brian Conrad, Megan Lilley & Sankar Mishra

Company: Amway

Business Partner: Marquee Crew (@MarqueeCrew)

 

Awards Category: From Zero to Hero

 

Global Procurement at Amway began its journey with Alteryx by purchasing two licenses in October 2015. We generated instant value from this tool and knew there was so much untapped potential. A few short months after our initial purchase, we started hosting internal Alteryx enablement sessions to spread the word throughout our organization. It wasn’t long before we were up to 10 licenses. As our scope continued to expand, IT got involved and pursued a mini-trial that included Server for the remainder of 2016. In 2017 we purchased the full-year pilot to drive even greater user adoption. Today, we have over 30 users and growing!


As our user base and experience with Alteryx grow, we have evolved from diverse data blending to language translation and normalization. We are moving from old legacy tools like Access and Excel into the modern world of analytics with Tableau and Alteryx. We are structuring data that we once thought impossible, and have even built applications to search eBay and now Amazon online. We are pushing into the world of predictive analytics: models are being developed and geospatial tool-sets are being examined. Most of these were pipe dreams before we were introduced to Alteryx. Now we are making these things come to life and blazing a trail for analytics at Amway.

 

Describe the problem you needed to solve

We found Alteryx through Tableau. It started with simple data blending and Tableau data automation, but grew from there:

  • Automating manual scorecards/metrics
  • Translation macro using Google Translate
  • eBay web scrape
  • Amazon web scrape
  • Commodity predictive modeling
  • Spatial

 

Describe the working solution

We are using a wide range of data sources including Excel, Access, SQL Server (In-DB tools), Oracle (In-DB tools), SharePoint, Google Sheets, eBay, and Amazon. Most of our data sets are published directly to Tableau Server. We have Server up and running to automate most of our Tableau dashboards. Deploying apps via the Alteryx Gallery is our next project to tackle.

 

Describe the benefits you have achieved

Alteryx is the engine that is driving our team to new levels. We are automating all of our scorecards from a data perspective, and we are able to provide daily insights on the health of our supply chain versus monthly reporting. Here are a few of the major projects we accomplished inside of Alteryx.

 

  1. Automated over 20 data processes, eliminating over 350 hours of data prep and saving over $80,000 annually.
    • These savings are based on time alone. Factor in the ability to run these workflows daily and deliver insights to our users, and this figure is very conservative.
    • These workflows are now reusable processing engines that we can continue to enhance and build on.
  2. Using Alteryx we are able to automate the translation of data. We operate in over 100 countries, and we developed a workflow that goes through our data and translates it automatically using Google Translate. We plan to deploy this on our Server as an app for others to leverage.
  3. Jordan Howell eliminated a custom Access database that cost us $3 million to build. Retiring the database eliminates 40 hours a month in manual data preparation and will save us $24,000 a year. In 3 weeks he was able to recreate the database in Alteryx.
  4. eBay & Amazon web scraping to effectively audit Amway products being sold on these sites. Before Alteryx we did this manually and captured only 1-10% of the products on the sites (we had trouble answering macro-level questions about how many products were listed). With Alteryx we can do this in seconds and capture 100% of the products, allowing users to focus on delivering insights from the data rather than pulling it!
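The scraping gain is about coverage: extracting every listing from a results page instead of hand-sampling a few. A minimal sketch of the extraction idea (the HTML structure below is invented for illustration, not eBay's or Amazon's actual markup; a real audit would fetch pages with an HTTP client and respect each site's terms):

```python
# Sketch (hypothetical markup): pull every product listing out of a
# search-results page, then filter to the brand being audited.
import re

sample_page = """
<div class="listing"><span class="title">Amway XS Energy Drink</span></div>
<div class="listing"><span class="title">Amway Nutrilite Daily</span></div>
<div class="listing"><span class="title">Generic Vitamin C</span></div>
"""

# Extract all listing titles, then keep only the audited brand's items.
titles = re.findall(r'<span class="title">(.*?)</span>', sample_page)
amway_items = [t for t in titles if "Amway" in t]

print(len(titles), len(amway_items))
```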

 

I1.png

 

 I2.png

 

 I3.png

 

 I4.png

 

I5.png

 

 I6.png

 

 

amway2.png

Catalyst logo.png

Author: Jason Claunch - President

Company: Catalyst

Business Partner: Slalom Consulting - Sean Hayward & Marek Koenig

 

Awards Category: Best Use of Alteryx for Spatial Analytics

 

The developed solution used many of the Spatial Analytics components available within Alteryx:

  • Trade Area – have the user select a target area to analyze
  • Spatial Match – combine multiple geospatial objects
  • Intersection – cut objects from each other to create the subject area
  • Grid tool – sub-divide the trade blocks to determine complete coverage of the trade ring
  • Distance – use a drive-time calculation to score and rank retailers in the vicinity

Describe the problem you needed to solve

Retail site analysis is a key part of our business and was taking up too much time with repetitive tasks that could have been easily automated.

 

Describe the working solution

To support selection of best-fit operators, Catalyst partnered with Slalom Consulting to develop a tool to identify potential uses to target for outreach and recruitment. Previously, we would have to manually build demographic profiles using tools like QGIS, Esri, and others, but found the process cumbersome and quite repetitive. Demographic data was acquired at the trade-block level, which was too granular for identifying target locations and would not mesh well with the retail data.

 

Alteryx and its spatial capabilities were used in a few ways:

 

1) Minimize our retail data selection from the entire US to a selected state using the Spatial Match tool.

catalyst1b.png

 

2) Create a demographic profile for each retail location consisting of data points such as median income, population, daytime employees, and others. The data was aggregated within a 3-mile radius of each retail location with an Alteryx macro composed of a Trade Area, Grid tool, Spatial Match, and Summarize tool.

 

catalyst2-A.png catalyst2-B.png catalyst2-C.png
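The radius aggregation can be approximated in plain Python (the coordinates and populations below are synthetic): keep only the demographic points within 3 miles of a store, then summarize them, mirroring the Trade Area + Spatial Match + Summarize pattern.

```python
# Sketch (synthetic points): sum population for census-style blocks
# falling within a 3-mile ring around a hypothetical retail location.
import math

def miles_between(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in miles.
    r = 3958.8  # mean Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

store = (32.78, -96.80)  # hypothetical retail location
blocks = [  # (lat, lon, population) for nearby blocks
    (32.79, -96.81, 1200),
    (32.80, -96.79, 800),
    (33.20, -96.80, 5000),  # roughly 29 miles away, outside the ring
]

in_ring = [b for b in blocks
           if miles_between(store[0], store[1], b[0], b[1]) <= 3.0]
population_3mi = sum(b[2] for b in in_ring)
print(population_3mi)
```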

 

3) Using a Map Input, the user selected an area to profile and candidate retailers were output for further review.

catalyst3.png

 

4) After selecting specific retailers for in-depth analysis, Alteryx scored all possible locations by distance (drive-time analysis) and by score (a proprietary weighting of various demographic attributes). The profiled results were then used to build a client presentation; the automated profiling tool saved us countless hours and allowed us to deliver more detailed analysis for our clients.

 

catalyst4.png
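The scoring step can be sketched as follows (the weights, drive times, and attributes are made up, since the real weighting is proprietary): blend a normalized demographic score with a drive-time penalty and rank the candidates.

```python
# Sketch (made-up weights and data): rank candidate locations by a
# blend of drive-time proximity and weighted demographic attributes.
candidates = [
    # (name, drive_minutes, median_income, daytime_employees)
    ("Site A", 5, 62000, 4000),
    ("Site B", 12, 80000, 9000),
    ("Site C", 25, 55000, 1500),
]
weights = {"income": 0.4, "employees": 0.6}  # hypothetical weighting

def score(c):
    _, drive, income, employees = c
    # Normalize each attribute to 0-1 against the candidate set.
    inc = income / max(x[2] for x in candidates)
    emp = employees / max(x[3] for x in candidates)
    demo = weights["income"] * inc + weights["employees"] * emp
    # Penalize longer drive times; 30 minutes or more scores zero.
    drive_score = max(0.0, 1 - drive / 30)
    return 0.5 * demo + 0.5 * drive_score

ranked = sorted(candidates, key=score, reverse=True)
print([c[0] for c in ranked])
```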

 

Describe the benefits you have achieved

Using Alteryx was a massive time saver: the tool we built took a process that normally required at least 8 hours of manual work down to merely a few minutes. This has directly benefited our bottom line by allowing us to focus on more key tasks in our client outreach and recruitment. A return on investment was immediately realized after we were able to close a deal with a major client using our new process.


Author: Jack Morgan (@jack_morgan), Project Management & Business Intelligence

 

Awards Category: Most Time Saved

 

After adding up the time savings for our largest projects we came up with an annual savings of 7,736 hours - yea, per year! In that time, you could run 1,700 marathons, fill 309,000 gas tanks or watch 3,868 movies!! Whaaaaaaaaaaaaat! In said time savings, we have not done any of the previously listed events. Instead, we've leveraged this time to take advantage of our otherwise unrealized potential for more diverse projects and support of departments in need of more efficiency. Other users that were previously responsible for running these processes now work on optimizing other items that are long overdue and adding value in other places by acting as project managers for other requests.

 

Describe the problem you needed to solve 

The old saying goes, "Time is of the essence," and there are no exceptions here! More holistically, we brought Alteryx into our group to better navigate disparate data and build one-time workflows that create sustainable processes with a heightened level of accuracy. In a constraint-driven environment, my team is continuously looking for ways to do things better, whether that is faster, more accurately, or with less oversight. The bottom line is that Alteryx provides speed, accuracy, and agility we never thought possible. Cost, along with the most expensive resource of all, human time, has been a massive driver throughout our Alteryx journey, and I expect these drivers will continue as time passes.

 

Describe the working solution

Our processes vary from workflow to workflow; overall we use a lot of SQL Server, Oracle, Teradata, and SharePoint. In some workflows we blend 2 sources; in others we blend all of them. It depends on the needs of the business we are working with on any given day. Once the blending is done, we do a variety of things with the data: sometimes it goes to apps for self-service consumption, and other times we push it into a data warehouse. One thing that is consistent is final data visualization in Tableau! Today, upwards of 95% of our workflows end up in Tableau, empowering our users with self-service and analytics reporting. When using databases like SQL Server and Oracle, we see MASSIVE gains from the In-Database tools. The ability for our Alteryx users to leverage such a strong no-code solution creates an advantage for us in the customer service and analytics space, because they already understand the data and now they have a means to get to it.

 

Audit Automation:

Audit Automation.PNG

 

Billing:

 

Billing.PNG

 

File Generator:

 

File Generator.PNG

Market Generator:

 

Market Data.PNG

 

Parse:

Parse.PNG

 

Describe the benefits you have achieved

The 7,736 hours mentioned above are cumulative across 7 different processes that we rely on regularly.

 

  1. One prior process took about 9 days/month to run - we've dropped that to 30s/month!
  2. Another process required 4 days/quarter; our team cut it to 3 min/quarter.
  3. The third and largest workflow would have taken an estimated 5,200 hours to complete; our team took 10.4 hours to do the same work!
  4. The next project was a massive one: we needed a tool to parse XML data into a standardized Excel format. This process (non-standard PDF to Excel) once took 40 hrs/month; we can now run it in less than 5s/month!
  5. Less impressive, but still a great deal of time: our systems and QA team contracted us to rebuild their daily reporting for Production Support Metrics. The process took them about 10 hours/month; we got it down to less than 15 sec/day.
  6. One of our internal QA teams asked us to help speed up the prep work for their weekly audit process. We automated a process that took them upwards of 65 hours/month into one that now takes 10 sec/week!
  7. The last of the 7 processes is one for survey data that took a team 2 hours/week; the same process now takes our team about 20 sec/week.

 

We hope you've found our write-up compelling and win-worthy!

 


 

Author: Jennifer Jensen, Sr. Analyst, and team members Inna Meerovich, RJ Summers

Company: mcgarrybowen 

 

mcgarrybowen is a creative advertising agency that is in the transformation business. From the beginning, mcgarrybowen was built differently, on the simple premise that clients deserve better. So we built a company committed to delivering just that. A company that believes, with every fiber of its being, that it exists to serve clients, build brands, and grow businesses.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve 

Mcgarrybowen creates hundreds of pieces of social creative per year for Fortune 500 CPG and Healthcare brands, on platforms including Facebook and Twitter. The social media landscape is constantly evolving, especially with the introduction of video, a governing mobile-first mindset, and interactive ad units like carousels, yet the platforms' capabilities for measuring performance have not kept pace.

 

Our clients constantly want to know: which creative is most effective, drives the highest engagement rates, and delivers most efficiently? What time of day and day of week are best for posting content? What copy and creative work best? On the other brands you manage, what learnings have you had?

 

But therein lies the challenge. Answers to these questions aren’t readily available in the platforms, which export post-level data in raw spreadsheets with many tabs of information, and both Facebook and Twitter can only export 90 days of data at a time. So, to look at client performance over longer periods, compare clients with their respective categories, and derive performance insights that drive cyclical improvements in creative, we turned to Alteryx.

 

Describe the working solution

Our Marketing Science team first created Alteryx workflows that blended multiple quarters and spreadsheet tabs of social data for each individual client. The goal was to take many files over several years that each contained many tabs of information, and organize it onto one single spreadsheet so that it was easily visualized and manipulated within Excel and Tableau for client-level understanding. In Alteryx, it is easy to filter out all of the unnecessary data in order to focus on the KPIs that will help drive the success of the campaigns.  We used “Post ID,” or each post’s unique identifying number, as a unifier for all of the data coming in from all tabs, so all data associated with a single Facebook post was organized onto a single row.  After all of the inputs, the data was then able to be exported onto a single tab within Excel.
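The unifying step described above — keying every tab's rows on "Post ID" so each post collapses to a single row — can be sketched in plain Python. This is an illustrative stand-in for the Alteryx join, with hypothetical tab and field names rather than the actual Facebook export schema:

```python
# Hypothetical sketch of the tab-merging step: rows from several export
# tabs are keyed on "Post ID" and merged so each post becomes one record.
# Tab and field names are illustrative, not the real export layout.

def merge_tabs_on_post_id(*tabs):
    """Merge lists of row-dicts into one combined record per Post ID."""
    merged = {}
    for tab in tabs:
        for row in tab:
            post_id = row["Post ID"]
            merged.setdefault(post_id, {}).update(row)
    return list(merged.values())

lifetime = [{"Post ID": "123", "Impressions": 5000}]
engagement = [{"Post ID": "123", "Reactions": 140, "Shares": 12}]

rows = merge_tabs_on_post_id(lifetime, engagement)
# All data associated with post "123" is now organized onto a single row.
```

In the actual workflow the same idea is expressed with Alteryx Join tools on the Post ID field before writing everything to one Excel tab.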

 

After each client’s data was cleansed and placed into a single Excel file, another workflow was made that combined every client’s individual data export into a master file that contained all data for all brands.  From this, we can easily track performance over time, create client and vertical-specific benchmarks, and report on data efficiently and effectively.

 

Single Client Workflow

mcgarrybowen1.png

 

Multi-Client Workflow

mcgarrybowen2.png

 

Describe the benefits you have achieved

Without Alteryx, it would take countless hours to manually work with the social data in 90 day increments and manipulate the data within Excel to mimic what the Alteryx workflow export does in seconds. With all of the saved time, we are able to spend more time on the analysis of these social campaigns.  Since we are able to put more time into thoughtful analysis, client satisfaction with deeper learnings has grown exponentially.  Not only do we report out on past performance, but we can look toward the future and more real-time information to better analyze and optimize.

Author: Kiran Ramakrishnan

 

Awards Category: Most Time Saved 

 

Through automating processes we received a lot of management attention and a desire to create more automated and on-demand analysis, dashboards and reports.

 

Another area where we have benefited significantly is training and process consistency. No more are we reliant on training new resources on learning the systems and process or critically affected by sudden departure of a team member.

 

BBB_definitions.PNG

 

Describe the problem you needed to solve 

We are a semiconductor company located in Silicon Valley. We have been in business for more than 30 years, with 45 locations globally and about 5,000 employees. We are in business to solve our customers' challenges and are a leader in driving innovation, particularly in microcontrollers. The company focuses on embedded processing, security, wireless, and touch technologies. In automotive, we provide solutions beyond touch, such as remote keyless entry and networking. Our emphasis is on IoT applications: we see potential in the Internet of Things market by combining our products, especially MCUs, security, and wireless technologies.

 

In this industry, planning is essential as the market is very dynamic and volatile but manufacturing cycles are long. Most electronic applications have comparatively short product life cycles and sharp production ramp cycles. Ignoring these ramps could result in over/under capacity. For a semiconductor company it is key to clearly understand these dynamics and take appropriate actions within an acceptable time.

 

To forecast and make appropriate predictions, organizations need critical information such as actual forecast, billings, backlog, and bookings. Based on this information, Sales, the BUs, and Finance are able to build models. Because End of Life (EOL) parts convert immediately into revenue, we need to treat them separately. Semiconductor sales are typically commission-based, and sales commissions are calculated by product category and type, so each line item needs to be matched to a salesperson by product life cycle. In public companies this is done on a quarterly basis, and regular updates increase the organization's confidence in achieving its goals. As electronics companies demand ever-higher levels of data-access security, the consolidated dataset needs to be protected to ensure compliance with customer agreements. Large organizations also require data security so that data is only accessible on a need-to-know basis.

 

user_guide.gif

Historically, people from these different groups manually created, cleansed, and merged data from various files and sources to gain insight. It is common to use different environments such as Oracle databases, SAP, Model N, SharePoint, Salesforce, Excel, and Access. This is extremely time-consuming and requires a huge manual effort. Data consistency between sources is usually not guaranteed and requires additional cleansing and manipulation. Because every person or group also has their own way of gathering and consolidating this information, the results and layouts typically differ, and it is hard for someone outside the group to understand another person's approach. These reports are a regular necessity and need to be compiled on a weekly or daily basis, depending on the refresh frequency. We also wanted to become independent of specific people for updating dashboards on demand; the current process makes reporting heavily reliant on human resources.

 

Describe the working solution

In Alteryx we found the solution to our problem. Alteryx was used to join data sources in different formats and environments, gathered from departments including Sales, Finance, Operations/Supply Chain, and Human Resources.

 

  • The Sales department provides Forecast in an Excel worksheet. As the worksheet is being accessed and edited by more than 500 individuals, data inconsistency between fields (such as time dimension) is an ongoing issue and data architecture needs to be re-organized and consolidated.
  • The Finance department provides Billings through Oracle Hyperion. There are data inconsistencies between Billings and Backlog & Bookings due to system differences, so Billings need to be merged with Backlog & Bookings, and EOL parts for commissions and forecast are identified.
  • The Operations/Supply Chain department provides Backlog & Bookings through SAP, which also has data inconsistencies between Backlog & Bookings and Billings due to system differences. Backlog and Bookings need to be merged with Billings, and EOL parts for commission and forecast are identified.
  • The HR department provides Organization Hierarchy through SAP HANA, in order to apply a row level security on the dashboards later on.

 

To resolve the issues, all relevant data is structured and follows the overall defined data architecture described in Alteryx. First, Alteryx pulls relevant data from various sources and stores it in a shared drive/folder. Then, Alteryx runs its algorithms based on our definitions. A special script was developed to publish and trigger a refresh of the dashboard with the latest data on a daily basis. Finally, a notification via email is sent to all the users (more than 500) with a hyperlink, once the refreshed data is published.
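The final publish-and-notify step above can be sketched in Python. This is a hedged illustration of the notification portion only: the server, recipient addresses, and dashboard URL are placeholders, and the actual send (e.g. via smtplib) is omitted:

```python
# Sketch of the daily notification step: once the refreshed data is
# published, build an email with a hyperlink to the dashboard for all
# subscribed users. Addresses and URL below are placeholders.

from email.message import EmailMessage

def build_refresh_notification(dashboard_url, recipients):
    """Assemble the notification email sent after each daily refresh."""
    msg = EmailMessage()
    msg["Subject"] = "Daily dashboard refresh published"
    msg["To"] = ", ".join(recipients)
    msg.set_content(
        f"The latest data has been published.\n"
        f"View the dashboard here: {dashboard_url}"
    )
    return msg

msg = build_refresh_notification(
    "https://example.internal/dashboards/daily",   # placeholder hyperlink
    ["user1@example.com", "user2@example.com"],    # 500+ users in practice
)
```

In production this would be handed to an SMTP client after the scripted dashboard refresh completes.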

 

Workflow.png

 

Describe the benefits you have achieved

Prior to the Alteryx implementation, a lot of time was spent downloading, storing, and consolidating the files, which resulted in multiple unexpected errors which were hard to identify. The accuracy and confidence level of the manually created dashboard was not very high, due to the unexpected human errors. Very often, the dashboards required so much preparation that by the time they were published they were already outdated.

 

Through the Alteryx approach, we have now eliminated manual intervention and reduced the effort to prepare and publish/distribute the reports to less than 1% compared to previous approach. In addition, through this streamlined approach we have stimulated collaboration on a global basis.

 

Departments such as IT, Finance, and Sales are able to work much more closely together, as they see results within an extremely short period of time.

The other advantage of this solution is that it is now broadly being used throughout the organization from the CEO to analysts based on the defined security model.

 

Running_Time.png

How much time has your organization saved by using Alteryx workflows?

It used to take us one week to create and develop the workflow. The biggest challenge we faced was to determine the individual steps and the responsible person as various resources and departments were required to contribute.

 

Through the Alteryx workflow we save more than 15 hours per week on data merging alone, and at the same time we can now publish the reports and analysis on a daily basis. Across the various departments, Alteryx now saves over 75 hours in running the process end-to-end.

 

What has this time savings allowed you to do?

Through automating the process we received a lot of management attention and a desire to create more automated and on-demand dashboards and reports.

 

Another area where we have benefited significantly is training and process consistency. No more are we reliant on training new resources on learning the systems and process or critically affected by sudden departure of a team member.

Author: Omid Madadi, Developer

Company: Southwest Airlines Co.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve 

Fuel consumption expense is a major challenge for the airline industry. According to the International Air Transport Association, fuel represented 27% of the total operating costs for major airlines in 2015. For this reason, most airlines attempt to improve their operational efficiency in order to stay competitive and increase revenue. One way to improve operational efficiency is to increase the accuracy of fuel consumption forecasting.

 

Currently, Southwest Airlines offers services in 97 destinations with an average of 3,500 flights a day. Not having enough fuel at an airport is extremely costly and may result in disrupted flights. Conversely, ordering more fuel than an airport needs results in high inventory and storage costs. As such, the objective of this project was to develop proper forecasting models and methods for each of these 97 airports, in order to increase the accuracy and speed of fuel consumption forecasting using historical monthly consumption data.

 

Describe the working solution

Data utilized in this project were from historical Southwest Airlines monthly fuel consumption reports. Datasets were gathered from each of the 97 airports as well as various Southwest departments, such as finance and network planning. Forecasting was performed on four different categories: scheduled flights consumption, non-scheduled flights consumption, alternate fuel, and tankering fuel. Ultimately, the total consumption for each airport was obtained by aggregating these four categories. Since data were monthly, time series forecasting and statistical models - such as autoregressive integrated moving average (ARIMA), time series linear and non-linear regression, and exponential smoothing - were used to predict future consumptions based on previously observed consumptions. To select the best forecasting model, an algorithm was developed to compare various statistical model accuracies. This selects a statistical model that is best fit for each category and each airport. Ultimately, this model will be used every month by the Southwest Airlines Fuel Department.
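The selection algorithm described above — score several candidate forecasters per airport and category, keep the most accurate — can be illustrated with a small Python sketch. The project's actual models (ARIMA, time series regression, exponential smoothing in R) are replaced here by deliberately simple stand-in forecasters; only the holdout-based selection mechanics are the point:

```python
# Illustrative model selection: fit simple forecasters to a monthly
# consumption series, score each on a holdout period by mean absolute
# error (MAE), and keep the winner. The stand-in models below are
# simplifications of the ARIMA/regression/smoothing models actually used.

def naive(history):
    """Forecast next month as the last observed month."""
    return history[-1]

def moving_average(history, window=3):
    """Forecast next month as the mean of the last `window` months."""
    return sum(history[-window:]) / window

def ses(history, alpha=0.5):
    """Simple exponential smoothing with a fixed smoothing factor."""
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def best_model(series, holdout=3, models=(naive, moving_average, ses)):
    """Return the model with the lowest MAE over the holdout months."""
    def mae(model):
        errors = [abs(model(series[:i]) - series[i])
                  for i in range(len(series) - holdout, len(series))]
        return sum(errors) / holdout
    return min(models, key=mae)

monthly_gallons = [100, 104, 98, 110, 115, 109, 120, 126, 118, 130]
model = best_model(monthly_gallons)
forecast = model(monthly_gallons)  # next-month forecast from the winner
```

Run per airport and per category (scheduled, non-scheduled, alternate, tankering), this yields the best-fit model for each combination, as the write-up describes.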

 

Capture3.PNG

 

In addition to developing a consumption forecast that increases fuel efficiency, a web application was also developed. This web application enables the Fuel Department to browse input data files, upload them, and then run the application in an easy, efficient, and effortless manner. Data visualization tools were also added to provide the Fuel Department with better insights of trends and seasonality. Development of the statistical models has been finalized and will be pushed to production for use by the Southwest Airlines Fuel Department soon.

 

Capture4.PNG

 

Capture6.PNG

 

Describe the benefits you have achieved

Initially, the forecasting process for all 97 Southwest Airlines airports used to be conducted through approximately 150 Excel spreadsheets. However, this was an extremely difficult, time-consuming, and disorganized process. Normally, consumption forecasts would take up to three days and would have to be performed manually. Furthermore, accuracy was unsatisfactory since Excel's capabilities are inadequate in terms of statistical and mathematical modeling.

 

For these reasons, a decision was made to use CRAN R and Alteryx for data processing and development of the forecasting models. Alteryx offers many benefits, since it allows executing R scripts via the R Tool. Moreover, Alteryx makes data preparation, manipulation, processing, and analysis fast and efficient for large datasets. Multiple data sources and various data types were used in the design workflow, and Alteryx made it convenient to select and filter input data, as well as join data from multiple tables and file types. In addition, the Fuel Department needed a web application that would allow multiple users to run the consumption forecast without the help of any developers; Alteryx was a simple solution to this need, since we could build an interface and publish the design workflow as a web application through the Southwest Airlines gallery.

 

In general, the benefits of the consumption forecast include (but are not limited to) the following:

 

  • Forecasting accuracy improved by approximately 70% for non-scheduled flights and 12% for scheduled flights, resulting in considerable fuel cost savings for Southwest Airlines.
  • The current execution time reduced dramatically from 3 days to 10 minutes. Developers are working to reduce this time even more.
  • The consumption forecast provides a 12-month forecasting horizon for the Fuel Department. Due to the complexity of the process, this could not be conducted previously using Excel spreadsheets.
  • The Fuel Department is able to identify seasonality and estimate trends at each airport. This provides invaluable insights for decision-makers on the fuel consumption at each airport.
  • The consumption forecast identifies and flags outliers and problematic airports and enables decision-makers to be prepared against unexpected conditions.


Author: Mark Frisch (@MarqueeCrew), CEO

Company: MarqueeCrew

 

Awards Category: Name Your Own - Macros for the Good of All Alteryx Users

 

Describe the problem you needed to solve 

Creating samples goes beyond random selection and taking every Nth record. It is crucial that samples be representative of their source populations if you are going to draw any meaningful truth from your marketing or other use cases. After creating a sample set, how would you verify that you didn't select too many of one segment versus another? If you're using Mosaic® data and there are 71 types to consider, did you get enough of each type?

 

image004.png

 

Describe the working solution

Using a chi-squared test, we created a macro and published it to the Alteryx Macro District as well as to the CReW macros (www.chaosreignswithin). There are two input anchors (Population and Sample), and the configuration requires that you select a categorical variable from both inputs (with the same variable content). The output is a report that tells you whether your sample is representative, including the degrees of freedom and the chi-square result against a 95% confidence interval.
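The macro's core check can be sketched in a few lines of Python: compare the sample's category counts against the counts expected from the population mix, and test the chi-square statistic against the standard 95% critical value for the appropriate degrees of freedom. Category labels and counts below are illustrative, not Mosaic types:

```python
# Minimal sketch of a chi-squared goodness-of-fit representativeness
# check, as the macro performs. Critical values are the standard
# chi-square table at the 95% confidence level (df 1-5).

CRITICAL_95 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488, 5: 11.070}

def is_representative(population_counts, sample_counts):
    """True if the sample's category mix matches the population's."""
    pop_total = sum(population_counts.values())
    n = sum(sample_counts.values())
    chi2 = 0.0
    for category, pop_count in population_counts.items():
        expected = n * pop_count / pop_total
        observed = sample_counts.get(category, 0)
        chi2 += (observed - expected) ** 2 / expected
    df = len(population_counts) - 1
    return chi2 <= CRITICAL_95[df]

population = {"A": 500, "B": 300, "C": 200}
sample_ok = {"A": 50, "B": 30, "C": 20}   # mirrors the population mix
sample_bad = {"A": 90, "B": 5, "C": 5}    # heavily skewed toward A
```

With 71 Mosaic types the same test runs at 70 degrees of freedom; the macro reports the statistic and degrees of freedom rather than hard-coding a small table as this sketch does.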

 

image005.jpg

 

Describe the benefits you have achieved

My client was able to avoid the costly mistake that had plagued their prior marketing initiative and was set up for success. I wanted to share this feature with the community. It would be awesome if it ended up helping my charity, the American Cancer Society. Although this isn't quite as sexy as my competition, it is sexy in its simplicity and geek factor.

 

image006.jpg

QralGroupLogo.jpg

Author: Ryan Bruskiewicz (@rbruskiewicz), Management Consultant

Company: Qral Group

 

Awards Category: Best Use of Alteryx for Spatial Analytics

 

I am using spatial analytics in Alteryx, in combination with healthcare utilization data for drugs and procedures published by Centers for Medicare & Medicaid Services (CMS) and shape files on data.gov (ZCTA, US primary roads) to optimize geographic territory alignment for sales representatives in the life sciences industry.

 

The process leverages several spatial analytics tools in Alteryx, including Distance/Driving Distance, Find Nearest, Create Points, SpatialObjCombine in Summarize Tool, Spatial Match, and Location Optimization Macros. The Alignment Optimization workflow outputs data files for visual mapping, analysis and summary reporting in Tableau, and outputs files to a tool called TerritoryMapper for manual refinement of territory zip code boundaries.

 

Describe the problem you needed to solve

The Initial Business Problem

The business problem solved is optimization of sales force territory alignments. The objective is to create territories that are balanced in terms of workload and sales potential while considering geographic constraints and travel time. The approach developed in Alteryx efficiently optimizes geographic alignments in a matter of minutes without costly purchases of third-party data sources:

  • Traditional approaches to this business problem in the life sciences typically require the purchase of third-party healthcare drug/procedure utilization data for specific therapeutic areas or markets, representing a significant investment. Leveraging data published by CMS, we were able to build a completely flexible model that can optimize territory alignment design for any market basket (i.e. any combination of drugs or healthcare procedures) without purchasing additional data.
  • Traditional approaches also typically require weeks or months to complete (as opposed to minutes or hours with our approach in Alteryx). The approach eliminates time spent purchasing/acquiring data, loading data, preparing/summarizing data, loading data into an alignment tool, manually defining and refining territory boundaries, and summarizing alignment results – these steps are fully-automated with our Alteryx workflow.

Additional use cases solved
Leveraging the alignment optimization workflow as an initial platform, solutions to address other related business problems have been developed and integrated into the workflow:

  • Market sizing & value concentration curve
  • Physician segmentation by patient volume & specialty
  • Sales force sizing to determine optimal # of sales representatives for a given targeting strategy
  • Territory-level call plans to physicians/accounts

Together, this set of solutions provides a suite of tools to automate and optimize field sales force deployment.

 

Drivers and applications
These solutions are used on projects to support Business Development and/or Commercial Operations teams within life sciences companies. The alignment optimization workflow has also been generalized to enable Sales Operations teams in any industry to design sales territories by taking an individual company’s customer target list and demand/sales history as an input.

 

Internally, Qral Group has leveraged this tool to create territory alignments for many combinations of sales force size and therapeutic area. With this broad set of alignment scenarios, we can quantify how much of territory alignment is effectively “objective” due to population distribution vs. variable for specific therapeutic areas due to regional differences in disease incidence and prevalence. We found that there is typically an 80-85% overlap of territory alignment, regardless of therapeutic area!

 

Optimal Territory Alignment by Therapeutic Area and Sales Force Size

Qral1.png

 

 

Describe the working solution

The working solution integrates ~12 GB of data representing ~12 billion healthcare claims from the following data sources:

  • Flat files (.csv)
    • Healthcare provider universe
    • Medicare provider utilization & payment data for inpatient procedures, outpatient procedures, Part D prescribers, and Part B services
    • Demographic data for population by age, gender by ZIP Code
  • Shape files (.shp)
    • US primary roads geodatabase
    • Cartographic boundary shape data for ZIP Code Tabulation Areas (ZCTA)

The alignment optimization tool consists of two repeatable workflows.

  • The first workflow integrates the data sources described above into a database that can be consumed by the alignment optimization algorithm, and is only run when raw data sources need to be updated (typically twice per year).
  • The second workflow connects to an Excel-based user input form and leverages several custom-built Alteryx macros to execute the geographic territory alignment optimization. This workflow is run on-demand to create the territory alignment output.

The Excel user input form allows a user to specify:

  • Drugs, services, and provider types to be considered markers for sales representative workload
  • Planned workload/call frequency by provider segment
  • Target workload range (min/max) for each territory for workload balancing optimization
  • Number of sales representatives, number of first-line sales managers, and number of second-line sales managers
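The workload-balancing target specified in the form can be illustrated with a greedy heuristic. The sketch below (hypothetical data structures, not the actual location optimization macro) repeatedly moves a border ZIP code out of the heaviest territory into the lightest territory adjacent to that ZIP, until every territory falls within the target maximum:

```python
from collections import defaultdict

def rebalance(assign, workload, adjacency, wmax, max_moves=1000):
    """Greedy rebalancing sketch: while any territory's total workload
    exceeds wmax, reassign one border ZIP from the heaviest territory
    to the lightest territory adjacent to that ZIP."""
    totals = defaultdict(float)
    for z, t in assign.items():
        totals[t] += workload[z]
    for _ in range(max_moves):
        heavy = max(totals, key=totals.get)
        if totals[heavy] <= wmax:
            break  # every territory is within the target maximum
        moved = False
        for z in [z for z, t in assign.items() if t == heavy]:
            # territories adjacent to this ZIP, other than its own
            foreign = {assign[n] for n in adjacency.get(z, []) if assign[n] != heavy}
            if foreign:
                target = min(foreign, key=lambda t: totals[t])
                totals[heavy] -= workload[z]
                totals[target] += workload[z]
                assign[z] = target
                moved = True
                break
        if not moved:
            break  # heavy territory has no border ZIPs to give away
    return assign

# Four ZIPs in a chain; T1 starts with three of them (workload 3 > wmax 2).
assign = {"z1": "T1", "z2": "T1", "z3": "T1", "z4": "T2"}
workload = {"z1": 1.0, "z2": 1.0, "z3": 1.0, "z4": 1.0}
adjacency = {"z1": ["z2"], "z2": ["z1", "z3"], "z3": ["z2", "z4"], "z4": ["z3"]}
result = rebalance(assign, workload, adjacency, wmax=2.0)
```

In this toy case the heuristic hands the border ZIP z3 from T1 to T2, leaving both territories at a workload of 2. The real workflow solves this with Alteryx's location optimization macros rather than a hand-rolled loop.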

The optimization workflow also includes macros to accomplish certain complex operations, batch processes, iterative processes, and optimizations, such as:

  • Clustering algorithm macro using native R-based clustering tool to identify territory workload centers
  • Adjustment of territory alignment to consider geographic constraints, such as US primary roads, driving time, and state boundaries
  • Batch macro to split heavy geographies into equal territories
  • Location optimization macro to optimally rebalance workload across neighboring territories by reassigning ZIP codes
  • Iterative macro to define locations for 1st line and 2nd line managers, and determine sales force hierarchy (assignment of reps to districts and regions)
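The clustering step that identifies territory workload centers uses Alteryx's native R-based clustering tool; a rough equivalent of the idea, sketched here as a plain Lloyd's-style k-means weighted by ZIP workload (with a naive deterministic initialization purely for illustration), might look like:

```python
def weighted_kmeans(points, weights, k, iters=50):
    """Lloyd's-style k-means over (x, y) points with per-point weights.
    Initialization is naively the first k points for illustration only;
    real implementations use better seeding (e.g. k-means++)."""
    centers = list(points[:k])
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for (x, y), w in zip(points, weights):
            # assign each point to its nearest current center
            j = min(range(k),
                    key=lambda i: (x - centers[i][0]) ** 2 + (y - centers[i][1]) ** 2)
            groups[j].append((x, y, w))
        for i, g in enumerate(groups):
            if g:  # recompute the weighted centroid of each group
                tw = sum(w for _, _, w in g)
                centers[i] = (sum(x * w for x, _, w in g) / tw,
                              sum(y * w for _, y, w in g) / tw)
    return centers

# Two well-separated ZIP clusters with equal workloads:
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
weights = [1.0] * 6
centers = sorted(weighted_kmeans(points, weights, k=2))
```

With two well-separated groups of points, the two centers converge to the weighted centroids of the groups, which then serve as candidate territory workload centers.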

The ZIP-to-territory alignment and sales force hierarchy are output to Tableau to visualize the geographic alignment and report summary statistics (e.g. number of customers, sales, workload) for each territory and span of control for managers.
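That summarization step is essentially a group-by aggregation over the ZIP-level records. A minimal sketch (field names here are hypothetical stand-ins; the real workflow writes its output for Tableau):

```python
from collections import defaultdict

def territory_stats(rows):
    """Roll ZIP-level records up to per-territory summary statistics."""
    agg = defaultdict(lambda: {"zips": 0, "customers": 0, "workload": 0.0})
    for r in rows:
        s = agg[r["territory"]]
        s["zips"] += 1  # one record per ZIP code
        s["customers"] += r["customers"]
        s["workload"] += r["workload"]
    return dict(agg)

# One record per ZIP, tagged with its assigned territory:
rows = [
    {"territory": "T1", "customers": 40, "workload": 12.0},
    {"territory": "T1", "customers": 25, "workload": 8.0},
    {"territory": "T2", "customers": 30, "workload": 10.0},
]
stats = territory_stats(rows)
```

The resulting per-territory totals are what a reporting layer such as Tableau would display alongside the map of the alignment.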

 

Describe the benefits you have achieved

Alteryx has had a significant impact on our consulting project work through time savings and cost reduction.  For this geographic territory alignment use case, specifically:

  • Time savings: Reduced time needed to complete analysis from 1-2 months to 1 day (savings across each analysis step from data acquisition, loading, preparation, alignment optimization, and results visualization)
  • Cost savings: For clients who do not already own prescriber-level data, reduced cost by eliminating need for a costly one-time data purchase from third party vendors to complete territory alignment analysis.

At Qral Group, we have realized many benefits of leveraging Alteryx Designer for other use cases as well. We regularly analyze large volumes of healthcare claims data, and Alteryx lets us process hundreds of millions to billions of records without heavy investment in infrastructure. This capability allows us to better understand treatment pathways and the patient journey, patient utilization, referral networks, physician and patient segmentation, payer cost impacts, and patient identification for rare diseases.