Past Analytics Excellence Awards

Author: Andy Kriebel (@VizWizBI), Head Coach

Company: The Information Lab

 

Awards Category: Best 'Alteryx for Good' Story

 

The Connect2Help 211 team outlined their requirements, including a review of the database structure and the outputs they were looking for. Note that this was also the week that we introduced the Data School to Alteryx. We knew that the team could use Alteryx to prepare, cleanse and analyse the data. Ultimately, the team wanted to create a workflow in Alteryx that Connect2Help 211 could use in the future.

 

Ann Hartman, Director of Connect2Help 211, summarized the impact best: "We were absolutely blown away by your presentation today. This is proof that a small group of dedicated people working together can change an entire community. With the Alteryx workflow and Tableau workbooks you created, we can show the community what is needed where, and how people can help in their communities."

 

The full details of the project can be found here: http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the problem you needed to solve 

In July 2015, Connect2Help 211, an Indianapolis-based non-profit service that facilitates connections between people who need human services and those who provide them, reached out to the Tableau Zen Masters as part of a broader effort that the Zens participate in for the Tableau Foundation. Their goals and needs were simple: create an ETL process that extracts Refer data, transforms it, and loads it into a MySQL database that can be connected to Tableau.

 

Describe the working solution

Alteryx-Workflow-211.png

 

See the workflow and further details in the blog post - http://www.thedataschool.co.uk/andy-kriebel/connect2help211/

 

Describe the benefits you have achieved

While the workflow looks amazingly complex, it absolutely accomplished the goal of creating a reusable ETL workflow. Ben Moss kicked off the project presentations by taking the Connect2Help 211 team through what the team had to do and how Connect2Help 211 could use this workflow going forward.

 

From there, the team went through the eight different visualisations that they created in Tableau. Keep in mind, Connect2Help 211 wasn't expecting any visualisations as part of the output, so to say they were excited with what the team created in just a week is a massive understatement.

 

Anuka.png


Author: Alexandra Wiegel, Tax Business Intelligence Analyst
Company: Comcast Corp


Awards Category: Best Business ROI

 

A Corporate Tax Department is not typically associated with a Business Intelligence team sleekly manipulating and mining large data sources for insights. Alteryx has allowed our Tax Business Intelligence team to provide incredibly useful insight to several branches of our larger Tax Department. Today, almost all of our data is in Excel or CSV format, so data organization, manipulation and analysis have previously been accomplished within the confines of Excel, with occasional Tableau visualization. Alteryx has given us the ability to analyze, organize, and manipulate very large amounts of data from multiple sources. Alteryx is exactly what we need to solve our colleagues' problems.


Describe the problem you needed to solve

Several weeks ago we were approached about using Alteryx to do a discovery project that would hopefully provide our colleagues further insight into the application of tax codes to customer bills. Currently, our Sales Tax Team uses two different methods to apply taxes to two of our main products respectively. The first method is to apply Tax Codes to customer bill records and then run those codes through software that generates and applies taxes to each record. The second method is more home-grown and appears to be leading to less consistent taxability on this side of our business.

 

Given that we sell services across the entire country, we wanted to explore standardization across all our markets. So, our Sales Tax team tasked us with creating a workflow that would compare the two methods and develop a plan toward standardization, including the effect it would have on every customer's bill.

 

Describe the working solution

Our original source file was a customer-level report in which each record was an item (products, fees, taxes, etc.) on a customer's bill, for every customer in a given location. As it goes with data projects, our first task was to cleanse, organize, and append the data to make it uniform.

 

21.PNG

 

The next step was to add in data from several other sources that we would ultimately need in order to show the different buckets of customers according to the monetary changes to their bills. These sources were all formatted differently, and there was often no unique identifier we could use to join them to our original report, so we had to create a method to ensure we did not create duplicate records when joining. We ended up using this process multiple times (pictured below).

 

22.PNG
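For readers who want to see the idea outside Alteryx, here is a minimal pandas sketch of the dedupe-before-join guard described above. The frames and column names (account_id, region) are invented for illustration; the production logic lives in the Alteryx Join, Unique and Summarize tools.

```python
import pandas as pd

# Invented stand-ins for the customer bill report and a supplemental source.
bills = pd.DataFrame({"account_id": [1, 1, 2], "item": ["service", "fee", "service"]})
extra = pd.DataFrame({"account_id": [1, 1, 2], "region": ["IN", "IN", "OH"]})

# Deduplicate the lookup side on the join key first so the join cannot
# multiply the bill records.
extra_unique = extra.drop_duplicates(subset="account_id")

merged = bills.merge(extra_unique, on="account_id", how="left", validate="m:1")
print(merged)  # still three bill rows, now enriched with region
```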

 

And so, the workflow followed. We added tax descriptions, new codes, and other information. We added calculated fields to determine the amount of tax that should be owed by each customer today, based on our current coding methods.

 

23.PNG

24.PNG

25.PNG

26.PNG


 

After we had layered in all the extra data that we would need to create our buckets, we distinguished between the two lines of business and added in the logic to determine which codes, at present, are taxable.

 

28.PNG

 

For the side of our business whose taxability is determined by software, you will notice that the logic is relatively simple. We added in our tax codes using the same joining method as above and then used a single join to a table that lists the taxable codes.

 

29.PNG

 

For the side of our business whose taxability is determined by our home-grown method, you can see below that the logic is more complicated. Currently, the tax codes for this line of business are listed in a way that requires us to parse a field and stack the resulting records in order to isolate individual codes. Once we have done this, we can apply the taxability logic. We then use the result as a lookup against the actual record to determine whether the code column contains a tax code that has been marked as taxable. In other words, applying our home-grown taxability logic is complicated, time consuming, and leaves much room for error.

 

210.PNG
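As a rough, non-Alteryx illustration of the parse-and-stack step, the pandas sketch below splits a packed code field into one row per code and flags records containing a taxable code. Column names and the delimiter are assumptions; the real workflow does this with Text To Columns, Join and Summarize tools.

```python
import pandas as pd

records = pd.DataFrame({
    "record_id": [101, 102],
    "codes": ["A1;B2;C3", "B2;D4"],   # several tax codes packed into one field
})
taxable_codes = {"B2", "C3"}          # stand-in for the taxable-code lookup table

# Split the packed field and stack one code per row.
stacked = records.assign(code=records["codes"].str.split(";")).explode("code")

# Flag each record that contains at least one taxable code, then join the flag back.
flags = (stacked.assign(is_taxable=stacked["code"].isin(taxable_codes))
                .groupby("record_id", as_index=False)["is_taxable"].any())
records = records.merge(flags, on="record_id")
print(records)
```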

 

Once we had stacked all this data back together, we joined it with the new tax code table. This gives us the new codes so that the software can be used for both lines of business. Once we know these new codes, we can simulate the process of the software and determine which of the new codes will be taxable.

 

211.PNG

 

Knowing whether or not codes are taxable helps us hypothesize about how problematic a geographic location may end up being for our team, but it does not tell us the dollar amount of taxes that will change. To know this, we must output files that will be run through the real software.

 

Hence, once we have completed the above data manipulation, cleansing, and organization, we extract the data that we want to have run through the software and reformat the records to match the format the software expects.

 

212.PNG

213.PNG

 

We created the above two macros to reformat the columns in order to simplify this extensive workflow. Pictured below is the top macro. The difference between the two resides in the first Select tool, where we have specified different fields to be output.

 

214.PNG

 

After the reformatting, we output the files and send them to the software team.

 

215.PNG

216.PNG

 

When the data is returned to us, we will be able to determine the current amount of tax that is being charged to each customer as well as the amount that will be charged once the codes are remapped. The difference between these two then becomes our buckets of customers, and our Vice President can begin to understand how the code changes will affect our customers' bills.
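The bucketing step itself is simple arithmetic once the software returns both tax amounts. A hedged pandas sketch, with invented figures and illustrative bin edges:

```python
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "tax_current":  [10.00, 4.50, 7.25, 0.00],
    "tax_remapped": [10.00, 6.00, 5.75, 1.20],
})

df["delta"] = df["tax_remapped"] - df["tax_current"]

# Bucket customers by how much their bill would change under the new codes.
df["bucket"] = pd.cut(
    df["delta"],
    bins=[-float("inf"), -1, -0.01, 0.01, 1, float("inf")],
    labels=["decrease > $1", "small decrease", "no change",
            "small increase", "increase > $1"],
)
print(df)
```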

 

Describe the benefits you have achieved

Although this project took several weeks to build in Alteryx, it was well worth the time invested, as we will be able to reuse it for other locations. We have gained incredible efficiency in acquiring insight on this standardization project using Alteryx. Another benefit we have seen in Alteryx is the flexibility to make minor changes to our workflow, which has helped us easily customize it for different locations. All of the various Alteryx tools have made it possible for the Tax Business Intelligence team to assist the Tax Department in accomplishing large data discovery projects such as this.

 

Further, we have begun creating an Alteryx app that can be run by anyone in our Tax Department. This frees up the Tax Business Intelligence team to work on other high-priority projects.

A common benefit theme amongst Alteryx users is that Alteryx workflows save companies large amounts of time in data manipulation and organization. Moreover, Alteryx has made it possible to handle large and complicated data sets (where it is impossible in Excel) in a very user-friendly environment. Alteryx will continue to be a very valuable tool that the Tax Business Intelligence team will use to help transform the Tax Department into a more efficient, more powerful, and more unified organization in the coming years.

 

How much time has your organization saved by using Alteryx workflows?

We could never have done this data discovery project without using Alteryx.  It was impossible to create any process within Excel given the quantity and complexity of the data.

 

In other projects, we have been able to replicate in Alteryx the Excel reconciliation processes that are run annually, quarterly, and monthly. The Alteryx workflows have saved our Tax Department weeks of manual Excel pivot-table work. Time savings on individual projects range from a few hours to several weeks.

 

What has this time savings allowed you to do?

The time savings have been invaluable. The Tax Department staff are now able to free themselves of repetitive tasks in Excel, obtain more accurate results, and spend their time analyzing and understanding the data. This "smarter" time spent on analysis will help transform the Tax Department, with greater opportunities to further add value to the company.


Author: Jack Morgan (@jack_morgan), Project Management & Business Intelligence

 

Awards Category: Most Time Saved

 

After adding up the time savings for our largest projects, we came up with an annual savings of 7,736 hours - yes, per year! In that time, you could run 1,700 marathons, fill 309,000 gas tanks or watch 3,868 movies!! Whaaaaaaaaaaaaat! Needless to say, we haven't done any of those things. Instead, we've leveraged this time to tap otherwise unrealized potential: more diverse projects and support for departments in need of more efficiency. Users who were previously responsible for running these processes now work on optimizing other long-overdue items and add value elsewhere by acting as project managers for other requests.

 

Describe the problem you needed to solve 

The old saying goes, "Time is of the essence," and there are no exceptions here! More holistically, we brought Alteryx into our group to better navigate disparate data and build one-time workflows that turn into sustainable processes with a heightened level of accuracy. In a constraint-driven environment, my team is continuously looking for ways to do things better; whether that means faster, more accurate or with less oversight is up to our team. The bottom line is that Alteryx provides speed, accuracy, and agility that we never thought would be possible. Cost, and the most expensive resource of all, human time, has been a massive driver for us throughout our Alteryx journey, and I expect these drivers will continue as time passes.

 

Describe the working solution

Our processes vary from workflow to workflow; overall, however, we use a lot of SQL, Oracle, Teradata and SharePoint. In some workflows we blend 2 sources; in others we blend all of them. It depends on the needs of the business we are working with on any given day. Once the blending is done we do a variety of things with it: sometimes it goes to apps for self-service consumption, and other times we push it into a data warehouse. One thing that is consistent in our process is final data visualization in Tableau! Today, upwards of 95% of our workflows end up in Tableau, allowing us to empower our users with self-service analytics and reporting. When using databases like SQL and Oracle we see MASSIVE gains from the In-Database tools. The ability for our Alteryx users to leverage such a strong no-code solution creates an advantage for us in the customer service and analytics space, because they already understand the data and now they have a means to get to it.
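To picture the In-Database gain outside Alteryx: the join and aggregation are pushed down to the database, so only a small summarized result travels back to the client. Below is a hedged SQLAlchemy sketch; the connection string and table names are placeholders, not our actual sources.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder connection string; in practice the blends span SQL Server, Oracle,
# Teradata and SharePoint sources.
engine = create_engine("oracle+oracledb://user:pass@host:1521/?service_name=svc")

# In-Database style: the join and aggregation run inside the database,
# and only the small summarized result comes back for Tableau.
query = text("""
    SELECT m.market_name, SUM(b.amount) AS billed
    FROM   billing b
    JOIN   markets m ON m.market_id = b.market_id
    GROUP  BY m.market_name
""")
with engine.connect() as conn:
    summary = pd.read_sql(query, conn)
```

The point is the shape of the work rather than the specific dialect: the blend happens where the data lives.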

 

Audit Automation:

Audit Automation.PNG

 

Billing:

 

Billing.PNG

 

File Generator:

 

File Generator.PNG

Market Generator:

 

Market Data.PNG

 

Parse:

Parse.PNG

 

Describe the benefits you have achieved

The 7,736 hours mentioned above is the cumulative total of 7 different processes that we rely on regularly.

 

  1. One prior process took about 9 days/month to run - we've dropped that to 30s/month!
  2. Another process required 4 days/quarter that our team was able to cut to 3 min/quarter.
  3. The third and largest workflow would have taken an estimated 5,200 hours to complete; our team took 10.4 hours to do the same work!
  4. The next project was a massive one: we needed to create a tool to parse XML data into a standardized Excel format. This process once took 40 hrs/month (non-standard PDF to Excel) and we can now run it in less than 5s/month!
  5. Less impressive, but still a great deal of time saved, was when our systems and QA team contracted us to rebuild their daily reporting for Production Support Metrics. This process took them about 10 hours/month; we got it down to less than 15 sec/day.
  6. One of our internal QA teams asked us to assist them in speeding up the pre-work for their weekly audit process. We automated a process that took them upwards of 65 hours/month down to one that now takes us 10 sec/week!
  7. The last of the 7 processes mentioned above is one for survey data that took a team 2 hours/week to process. That same process now takes our team about 20 sec/week.

 

We hope you've found our write-up compelling and win-worthy!

 

Author: Jennifer Jensen, Sr. Analyst, and team members Inna Meerovich, RJ Summers

Company: mcgarrybowen 

 

mcgarrybowen is a creative advertising agency that is in the transformation business. From the beginning, mcgarrybowen was built differently, on the simple premise that clients deserve better. So we built a company committed to delivering just that. A company that believes, with every fiber of its being, that it exists to serve clients, build brands, and grow businesses.

 

Awards Category: Best Business ROI

 

Describe the problem you needed to solve 

mcgarrybowen creates hundreds of pieces of social creative per year for Fortune 500 CPG and Healthcare brands, on platforms including Facebook and Twitter. The social media landscape is constantly evolving, especially with the introduction of video, a governing mobile-first mindset, and interactive ad units like carousels, yet the capabilities for measuring performance on the platforms have not kept pace.

 

Our clients constantly want to know: which creative is most effective, drives the highest engagement rates, and delivers most efficiently? What time of day and day of week are best for posting content? What copy and creative work best? What learnings have you had on the other brands you manage?

 

But therein lies the challenge. Answers to these questions aren't readily available in the platforms, which export post-level data as raw spreadsheets with many tabs of information. Both Facebook and Twitter can only export 90 days of data at a time. So, to look at client performance over longer periods, compare it against the respective categories, and derive performance insights that drive cyclical improvements in creative, we turned to Alteryx.

 

Describe the working solution

Our Marketing Science team first created Alteryx workflows that blended multiple quarters and spreadsheet tabs of social data for each individual client. The goal was to take many files spanning several years, each containing many tabs of information, and organize them into one single spreadsheet that could be easily visualized and manipulated within Excel and Tableau for client-level understanding. In Alteryx, it is easy to filter out all of the unnecessary data in order to focus on the KPIs that will help drive the success of the campaigns. We used "Post ID," each post's unique identifying number, as a unifier for all of the data coming in from all tabs, so all data associated with a single Facebook post was organized onto a single row. After all of the inputs, the data could then be exported onto a single tab within Excel.
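Conceptually, the tab-unification step looks like the pandas sketch below: read every tab of a platform export and fold them into one row per post keyed on "Post ID". The file names are placeholders, and the real work is done in Alteryx Input and Join tools.

```python
import pandas as pd

# sheet_name=None reads every tab of the export into a dict of DataFrames.
tabs = pd.read_excel("facebook_post_level_export.xlsx", sheet_name=None)

combined = None
for name, tab in tabs.items():
    # Prefix columns with the tab name so metrics from different tabs stay distinct.
    tab = tab.rename(columns=lambda c: c if c == "Post ID" else f"{name}: {c}")
    combined = tab if combined is None else combined.merge(tab, on="Post ID", how="outer")

combined.to_excel("facebook_single_tab.xlsx", index=False)
```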

 

After each client’s data was cleansed and placed into a single Excel file, another workflow was made that combined every client’s individual data export into a master file that contained all data for all brands.  From this, we can easily track performance over time, create client and vertical-specific benchmarks, and report on data efficiently and effectively.

 

Single Client Workflow

mcgarrybowen1.png

 

Multi-Client Workflow

mcgarrybowen2.png

 

Describe the benefits you have achieved

Without Alteryx, it would take countless hours to manually work with the social data in 90 day increments and manipulate the data within Excel to mimic what the Alteryx workflow export does in seconds. With all of the saved time, we are able to spend more time on the analysis of these social campaigns.  Since we are able to put more time into thoughtful analysis, client satisfaction with deeper learnings has grown exponentially.  Not only do we report out on past performance, but we can look toward the future and more real-time information to better analyze and optimize.


Author: Mark Frisch (@MarqueeCrew), CEO

Company: MarqueeCrew

 

Awards Category: Name Your Own - Macros for the Good of All Alteryx Users

 

Describe the problem you needed to solve 

Creating samples goes beyond random selection and N'ths. It is crucial that samples be representative of their source populations if you are going to draw any meaningful truth from your marketing or other use cases. After creating a sample set, how would you verify that you didn't select too many of one segment vs. another? If you're using Mosaic® data and there are 71 types to consider, did you get enough of each type?

 

image004.png

 

Describe the working solution

Using a chi-squared test, we created a macro and published it to the Alteryx Macro District as well as to the CReW macros (www.chaosreignswithin). There are two input anchors (Population and Sample), and the configuration requires that you select a categorical variable from both inputs (with the same variable content). The output is a report that tells you whether your sample is representative or not (including degrees of freedom and the chi-square result against a 95% confidence interval).

 

image005.jpg
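For readers who want the statistics behind the macro, here is a minimal Python sketch of the same chi-squared check using scipy. The segment counts are invented; in the macro, the Population and Sample anchors supply them.

```python
import numpy as np
from scipy.stats import chisquare

population = np.array([5000, 3000, 2000])   # source population by segment
sample     = np.array([260, 140, 100])      # drawn sample by segment

# Expected sample counts if the sample mirrored the population mix exactly.
expected = population / population.sum() * sample.sum()

stat, p_value = chisquare(f_obs=sample, f_exp=expected)
dof = len(sample) - 1
print(f"chi-square = {stat:.2f}, df = {dof}, p = {p_value:.3f}")
print("representative at 95% confidence" if p_value > 0.05 else "not representative")
```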

 

Describe the benefits you have achieved

My client was able to avoid the costly mistake that had plagued their prior marketing initiative and was set up for success. I wanted to share this feature with the community. It would be awesome if it ended up helping my charity, the American Cancer Society. Although this isn't quite as sexy as my competition, it is sexy in its simplicity and geek factor.

 

image006.jpg


Author: Brodie Ruttan (@BrodieR), Lead Analytics & Special Projects

Company: Downer New Zealand

 

Awards Category: Name Your Own - Best Use of Alteryx SharePoint Integration

 

Describe the problem you needed to solve

I work for the largest services company in New Zealand, Downer NZ Ltd: Water services, Telecommunications, Power, Gas, Mining, Roads, Rail, Airports, Marine, Defense, etc. Our work streams are business-to-business and business-to-government, and as such there are many different, disparate, aged data sources to work with. While we are moving work streams onto new platforms, many of the databases and information systems we use are very dated, and continuing to develop them is cost prohibitive.

To keep providing our customers with the increased level of service they desire, we need to keep capturing new metrics, but we can't spend the money to further develop aged systems. How can we implement a solution to capture these new metrics without additional cost, and can we use what we learn from capturing this data to shape the new information systems that will run these work streams?

 

Describe the working solution

What we have implemented at Downer is a solution whereby we develop SharePoint lists to sit alongside our current information systems, gather supplementary data about the work we do, and seamlessly report on it. An example would be one of our technicians at a Cell Mast Site (think cell/mobile phone transmitting tower) needing to report that the work cannot be completed, but that the site has been "Made Safe." "Made Safe" is not a Boolean field available in our current information systems. This is where Alteryx comes in and provides the value. Alteryx is capable of pulling the data out of the aged system and pushing the required job details into SharePoint. Once data has been added to the SharePoint list, Alteryx can then blend it seamlessly back into exports for reporting and monitoring purposes.

Brodie_Screenshot_Workflow.png
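As a very rough sketch of the push-to-SharePoint leg (the real solution uses Alteryx's SharePoint connectivity), a job record can be posted to a SharePoint list through its REST API. The site URL, list name and item type below are placeholders, and authentication plus the X-RequestDigest header required for POSTs are environment-specific and omitted here.

```python
import requests

# Hypothetical job record pulled from the legacy system.
job = {"Title": "JOB-10432", "Site": "Cell Mast 17", "MadeSafe": True}

site = "https://tenant.sharepoint.com/sites/fieldwork"   # placeholder site URL
list_title = "Job Supplementary Data"                    # assumed list name

session = requests.Session()                             # auth setup omitted
session.headers.update({"Accept": "application/json;odata=verbose",
                        "Content-Type": "application/json;odata=verbose"})

resp = session.post(
    f"{site}/_api/web/lists/getbytitle('{list_title}')/items",
    json={"__metadata": {"type": "SP.Data.JobSupplementaryDataListItem"},  # assumed type name
          **job},
)
resp.raise_for_status()
```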

Describe the benefits you have achieved

Our business now has the capability to expand legacy systems seamlessly using Alteryx and SharePoint. The cost of implementing the solution is limited to the licensing costs of Alteryx and a SharePoint environment. Considering both of these licensing costs are sunk, we are able to expand systems at only the cost of time, which when using Alteryx and SharePoint is minimal. The cost benefit is immense: upgrading or expanding a legacy information system is a hugely expensive effort with little benefit to show, and legacy information systems in our environment mostly need to be migrated rather than upgraded. While we build these lists to expand our capability and keep our customers satisfied, we also get the benefit of lessons learned for developing the new platform. Any information gathered in SharePoint using Alteryx needs to be planned for when the new information system is stood up, which saves the effort and cost of additional business analyst work.

 

We have also expanded this capability using Alteryx to pull out multi-faceted work projects for display in Gantt views in SharePoint and then to pull the updated information back into the host systems.

 

Brodie_Screenshot.png


Author: Andy Moncla (@andy_moncla), Chief Operating Officer & Alteryx ACE

Company: B.I. Spatial

 

Awards Category:  Best Use of Spatial

With Spatial in our company name, we use spatial analytics every day: to better understand consumer behavior, especially relative to the retail stores, restaurants and banks consumers use. We are avid proponents and users of customer segmentation and rely on Experian's Mosaic within ConsumerView. In the last 2 years we have invested heavily in understanding the appropriate use of mobile device location data. We help our clients use the mobile data to better understand their customers, as well as their competitors' customers and trade areas.

 

Describe the problem you needed to solve 

Among retail, restaurant and financial services location analysts, one of the hottest topics is using mobile device location data as a surrogate for customer intercept studies. The beauty of this data, when used properly, is that it provides incredible insight. We can define home and work trade areas, differentiate between a shopping center's trade area and those of its anchors, understand shopping preferences, identify positive co-tenancies, and perform customer segmentation studies.

 

The problem, or opportunity, we wanted to solve was to: 

1. Develop a process that would allow us to clean/analyze each mobile device’s spatial data in order to determine its most probable home location 

2. Build a new, programmatic trade area methodology that would best represent the mall/shopping center visitors’ distribution 

3. Easily deliver the trade areas and their demographic attributes 

 

And, it had to scale. You see, our company entered into a partnership with UberMedia and the Directory of Major Malls to develop residence-based trade areas for every mall and shopping center in the United States and Canada – about 8,000 locations. We needed to get from 100 billion rows of raw data to 8,000 trade areas. 

 

Describe the working solution

Before I get into the details I’d like to thank Alteryx for bringing Paul DePodesta back as a Keynote Speaker this year at Inspire. Paul spoke at a previous Inspire and his advice to keep a journal was critical to the success of this project. I actually kept track of CPU and Memory usage as I was doing my best to be the most efficient. Thanks for the advice Paul. 

 

journal.png

 

Using only Alteryx Spatial, we were able to accomplish our goal. Without giving away the secret sauce, here’s what we did. We divided the task into three parts which I will describe below. 

 

1.  Data Hygiene and Analysis (8 workflows for each state and province) – The goal of this portion was to identify the most likely home location for each unique device. It is important to note that the raw data is fraught with bad data, including common device identifiers, false location data and location points that could not be a home location. To clean the data, nearly all of the 100 billion rows of data were touched dozens of times. Here are some of the details.

a. Common Device Identifiers

i. The Summarize tool was used to determine those device ID’s, which were then used within a Filter tool 

ii. Devices with improper lengths were also removed using the Filter tool 

b. False Location Data – every now and again there is a lat/long that has an inexplicably high number of devices (think tens or hundreds of thousands). These points were eliminated using algorithms built with the Create Points, Summarize and Formula tools, coupled with spatial filtering.

c. Couldn't be a Home Location – For a point to be considered a likely home location, it had to be within a populated Census Block and not within other spatial features. We downloaded the Census Blocks from the Census Bureau and, utilizing the TomTom data included within Alteryx Spatial, built a series of spatial filter files for each US state and Canadian province. To build the spatial filters (one macro with 60+ tools), we used the following spatial tools:

i. Create Points 

ii. Trade Area 

iii. Buffer 

iv. Spatial Match 

v. Distance 

vi. Spatial Process Cut 

vii. Summarize - SpatialObj Combine 

 

Once the filters were built all of the data was passed through the filters, yielding only those points that could possibly be a home location. 
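Outside Alteryx, the "could be a home" spatial test is roughly a point-in-populated-block join. Here is a hedged geopandas sketch; the file paths and population column name are placeholders, and it assumes a recent geopandas with the predicate keyword.

```python
import geopandas as gpd

points = gpd.read_file("device_points.gpkg")                   # observed device locations
blocks = gpd.read_file("census_blocks.shp").to_crs(points.crs)

# Keep only points that fall inside a populated Census Block; the production
# filters also cut out other spatial features that cannot be homes.
populated = blocks[blocks["POP100"] > 0]
candidates = gpd.sjoin(points, populated[["geometry"]], predicate="within", how="inner")
```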

 

Typically, there are over one thousand observations per device, so even after the filtering there was work left to be done. We built a series of workflows that took advantage of the Calgary tools so that we could analyze each device, individually. Since every device record was timestamped, our workflows were able to identify clusters of activity over time and calculate the most likely home location. Tools critical to this process included: 

  • Sort 
  • Tile 
  • Multi-row Formula 
  • Calgary Join and Input 
  • Formula 
  • Create Points 
  • Trade Area 
  • Distance 

The Hygiene portion of this process reduced 100 billion rows of raw data to about 45 million likely home locations. 
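The home-location idea can be illustrated with a deliberately simplified heuristic: grid each device's points and pick the densest overnight cell. This is not the production logic (which scores clusters of timestamped activity with the Calgary, Tile and Trade Area tools); the observations below are invented.

```python
import pandas as pd

obs = pd.DataFrame({
    "timestamp": pd.to_datetime(["2016-03-01 23:10", "2016-03-02 02:40",
                                 "2016-03-02 13:05", "2016-03-05 01:15"]),
    "lat": [39.7685, 39.7686, 39.9100, 39.7684],
    "lon": [-86.1581, -86.1580, -86.0500, -86.1582],
})

# Snap points to a coarse grid and keep only overnight observations (9pm-6am),
# when a device is most likely to be at home.
obs["cell"] = list(zip(obs["lat"].round(3), obs["lon"].round(3)))
hour = obs["timestamp"].dt.hour
overnight = obs[(hour < 6) | (hour >= 21)]

home_cell = overnight["cell"].value_counts().idxmax()
print("likely home cell:", home_cell)
```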

 

2.  Trade Area Delineation (4 workflows/macros for each mall and shopping center, run iteratively until the capture rate was achieved) – We didn't want to manually delineate thousands of trade areas. We did want a consistent, programmatic methodology that could be run within Alteryx. In short, we wanted the trade area method to produce polygons that depicted concentrations of visitors without including areas that didn't contribute. We also didn't want to predefine the extent of the trade areas (e.g., 20 minutes); we wanted the data to drive the result. This is what we did.

a. Devised a Nearest Neighbor Methodology and embedded it within a Trade Area Macro – Creates a trade area based on each visitor’s proximity to other visitors. Tools used in this Macro include:

i. Calgary 

ii. Calgary Join 

iii. Distance 

iv. Sort 

v. Running Total 

vi. Filter 

vii. Find Nearest 

viii. Tile 

ix. Summarize – SpatialObj Combine 

x. Poly-Split 

xi. Buffer 

xii. Smooth 

xiii. Spatial Match 

 

b. Nest the Trade Area Macro within an Iterative Macro – Placing the Trade Area Macro within the Iterative Macro allows Alteryx to run multiple scenarios until the trade area capture rate is achieved 

c. Nest the Iterative Macro within a Batch Macro – Nesting the Iterative Macro within the Batch Macro allows us to run an entire state at once 

 

The resultant trade areas do a great job of depicting where the visitors live. Although rings and drive times are great tools, especially when considering new sites, trade areas based on behavior are superior. For the shopping center below, a ring would have included areas with low visitor concentrations, but high populations. 

 

trade area with ring.png
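A deliberately simplified stand-in for the capture-rate idea is sketched below: rank visitor home points and stop once the target share of visitors is reached, then build the polygon only from those points. The actual macro ranks on each visitor's proximity to other visitors and then combines, buffers and smooths the surviving points; the distance-to-center ranking here is purely for illustration, with invented figures.

```python
import pandas as pd

visitors = pd.DataFrame({
    "visitor_id": range(1, 9),
    "miles_to_center": [0.8, 1.1, 1.4, 2.0, 2.3, 3.1, 7.9, 12.5],  # invented
})
capture_rate = 0.75   # share of visitors the trade area should contain

ranked = visitors.sort_values("miles_to_center").reset_index(drop=True)
ranked["cum_share"] = (ranked.index + 1) / len(ranked)

inside = ranked[ranked["cum_share"] <= capture_rate]
print(inside)   # these points would then be combined/buffered/smoothed into the polygon
```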

 

3.  Trade Area Attribute Collection and Preparation (15 workflows) – Not everyone in business has mapping software, but many are using Tableau. We decided that we could broaden our audience if we simply made our trade areas available within Tableau. 

 

Using Alteryx, we were able to easily export our trade areas for Tableau. 

Tableau - trade area.png

 

Build Zip Code maps. 

 

Tableau - zip code contribution.png

 

For our clients that use Experian’s Mosaic or PopStats demographics, Alteryx allows us to attach the trade area attributes. 

Tableau - mosaic bubbles.png

Tableau - PopStats.png

 

Describe the benefits you have achieved

The benefits we have achieved are incredible. 

 

The impact to our business is that both our client list and industry coverage have more than doubled without having to add headcount. By year end, we expect our clients’ combined annual sales to top $250 billion. Our own revenues are on pace to triple. 

 

Our clients are abandoning older customer intercept methods and depending on us. 

 

Operationally, we have repeatable processes that are lightning fast. We can now produce a store or shopping center's trade area in minutes. Our new trade area methodology has been very well received and frequently requested. 

 

Personally, Alteryx has allowed me to harness my nearly 30 years of spatial experience and create repeatable processes and to continually learn and get better. It’s fun to be peaking almost 30 years into my career. 

 

Since we have gone to market with the retail trade area product we have heard “beautiful”, “brilliant” and “makes perfect sense.” Everyone loves a pat on the back, but, what we really like hearing is “So, what’s Alteryx?” and “Can we get pricing?” 

Author: Kristin Scholer (@kscholer), Insight Manager
Company: Ansira
Awards Category: Most Time Saved

 

Ansira, an industry-leading marketing agency in integrated real-time customer engagement, activates big data through advanced analytics, advertising technology, programmatic media and personalized customer experiences. Ansira leverages superior marketing intelligence to build deeper, more effective relationships with consumers and the retail channel partners that engage them on the local level. Marketing intelligence is infused across all disciplines and executed through digital, direct, social, mobile, media and creative execution, marketing automation, co-op and trade promotion. 

 

Describe the problem you needed to solve

As a data-driven advertising agency, Ansira is constantly profiling customer behavior for a variety of clients in industries such as quick service restaurants, automotive brands and large retailers. Ansira’s Analytics team heavily utilizes media and consumer research that comes from the MRI Survey of the American Consumer to create Customer Behavior Reports. This large survey provides a vast database of demographics, psychographics, media opinions and shopper behavior that give insights into the actions and behaviors of the U.S. consumer. These insights help Ansira better understand consumers for new business perspectives as well as develop strategies for existing clients.


The challenge the Analytics team faced was that these rich insights were not easy to format, interpret or analyze. The data is accessed through an online portal and exported into an Excel format that does not make the data easy to manipulate. Depending on the project requirements, it could take an analyst 4-8 hours to format the data, identify survey responses that are statistically significant, build out a report to display all the information and write up a full summary. This was not cost effective, and it became clear that a better way to transform this data was needed if Ansira wanted to utilize it on a regular basis.

 

Describe the working solution

After using Alteryx to format unfriendly Excel output for many projects, it was clear to the Analytics team that Alteryx could also be a solution for speeding up the Customer Behavior Report process. In about two days, one team member was able to create an Alteryx workflow that did all of the Excel formatting in just three seconds (this was generally taking over an hour to do manually).

 

Then Alteryx was taken one step further, as formula tools were integrated to identify which behaviors were statistically significant for an analysis (this was taking 1-2 hours to work through manually). Next, the process was simplified one more time by incorporating the reporting tools to create a full report of all the data needed in the form of a PDF. The report even included color coding to easily identify statistically significant behaviors. Not only did this create a beautiful report in seconds, but it also made key behaviors easy to identify, taking the analysis and summary process from 2-3 hours down to 15-30 minutes.
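As an illustration of the kind of logic such formula tools can encode, here is a minimal sketch that flags statistically significant behaviors with a two-proportion z-test. The actual test, column names and threshold Ansira used are not described in the write-up, so treat this purely as an assumption-laden stand-in.

import math

def z_two_proportions(hits_a, n_a, hits_b, n_b):
    # z-statistic comparing the target group's response rate to the base group's.
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se if se else 0.0

rows = [  # hypothetical survey rows: respondents answering "yes" in each group
    {"behavior": "Visited a QSR in the last 30 days",
     "target_yes": 620, "target_n": 1000, "base_yes": 5400, "base_n": 10000},
]
for row in rows:
    z = z_two_proportions(row["target_yes"], row["target_n"],
                          row["base_yes"], row["base_n"])
    row["significant"] = abs(z) > 1.96   # two-sided, ~95% confidence
    print(row["behavior"], round(z, 2), row["significant"])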

 

Describe the benefits you have achieved

The process that was created in Alteryx has allowed the Ansira Analytics team to offer Customer Behavior Reports to the New Business and Strategy departments that can be turned around in a day instead of a week. If a full analysis is not needed, the Analytics team can turn around just the PDF data report in as little as 15 minutes (see picture below). This gives Ansira additional direction on the consumers it is targeting, which can be instrumental in creating a new campaign.

 

31.jpg


To make this process even easier, the Analytics team has created a request form (see picture below) that anyone at Ansira can use to identify the behaviors they are interested in seeing for their client. Once the request form is received by the Analytics team, they can do a quick data pull from the MRI online portal, update the Alteryx workflow and have a full report created in under an hour.

32.jpg

 

Ansira recently had a consumer packaged goods client where the Strategy team needed to learn more about the difference in behavior patterns between Millennials and Non-Millennials who purchased 16 specific products. The Analytics team was able to pull data from MRI on these 16 products, run it through the Customer Behavior Report workflow and create 16 individual reports, one per product, comparing Millennial and Non-Millennial purchase behaviors, in less than 4 hours. Without Alteryx, this would have taken a single analyst almost a full week to complete and likely would never have been possible due to budget and capacity constraints.

 

Creating these Customer Behavior Reports has become a regular occurrence, with two to three requests coming into the Analytics department each week. With the help of Alteryx, these reports have become a large asset to Ansira, as they provide very impactful information without a lot of effort.

Author: Keith Snow (@ksnow), President/Data Scientist 

Company: B2E Direct Marketing

 

Awards Category: Best 'Alteryx For Good' Story

On December 1st, 2015, which was "Giving Tuesday", a global day dedicated to giving back, B2E Direct Marketing announced a newly created grant program for 2016 called 'Big Data for Non-Profits'.  B2E Direct Marketing is a business offering Big Data, Visual Business Intelligence and Database Marketing solutions.

 

Non-profit organizations are a crucial part of our society, providing help to the needy, education for a lifetime, social interactions and funds for good causes.

 

Describe the problem you needed to solve 

While serving on three non-profit boards, Keith Snow, President of B2E, became aware that data is among the most important, under-used and least maintained assets of a non-profit. 

 

B2E_Volunteer.png"The 'Big Data for Non-Profits' Grant program was born out of a vision that we had at B2E to give back to our community. We wanted to offer non-profits the same visual business intelligence and database marketing services that we offer our other clients." says Snow.

 

The grant program includes the following services free of charge to the winning organization in the month for which they are selected:

  • Data Hygiene (clean up donor file)
  • Data Append (age, income, gender, marital status, lifestyle segmentation, and more)
  • Detailed donor analysis and overview reports

 

Each month in 2016, B2E will choose one non-profit from those that apply through www.nonprofit360marketing.com. Award recipient applications are reviewed by a panel selected by B2E, and awards are given based upon how the services will be used to further the organization's goals. The grant program began accepting applications from eligible 501(c)(3) non-profits at the end of December and has already completed work for three organizations this year.

 

"We are excited about using Alteryx to help non-profits expand their mission and to better serve our communities." says Snow.

 

Describe the working solution

B2E has an initial consultation meeting with each non-profit where the goals and takeaways of the 'Big Data for Non-Profits' program are discussed.

 

We identify current data sources that the non-profit has available, and request up to 48 months of donor contact and giving information.  Minimal information is requested from the non-profit as we know great value can be added using Alteryx Designer.

  • Name                                                                         
  • Address, City, State, Zip
  • Phone
  • Date of Donation
  • Amount of Donation
  • Campaign
  • Donation type: cash, check, soft credit, etc.

B2E_Alteryx.png


B2E has created Alteryx workflows to perform donor file hygiene. Since we have licensed the data package, we take advantage of the CASS, Zip4 Coder and Experian Geodemographic append and TomTom capabilities.

 

All donor data is address-standardized to meet postal standards, and duplicates within the database are identified.  Once the data is updated to meet our standards, we process the files against the National Change of Address and the National Deceased database. 
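As a loose illustration of the duplicate-identification step (the real workflow leans on CASS-standardized addresses and Alteryx's own tools), a simple match key over a lightly standardized address might look like the sketch below; the standardization rules and column names here are assumptions.

import pandas as pd

donors = pd.DataFrame({
    "name": ["Jane Doe", "Jane Doe", "John Q Public"],
    "address": ["123 Main St", "123 MAIN STREET", "500 Oak Ave"],
    "zip": ["50309", "50309", "50310"],
})

def standardize(addr):
    # Toy standardization; CASS handles far more cases in the real process.
    return addr.upper().replace(" STREET", " ST").replace(" AVENUE", " AVE").strip()

donors["match_key"] = (donors["name"].str.upper() + "|"
                       + donors["address"].map(standardize) + "|" + donors["zip"])
donors["is_duplicate"] = donors.duplicated(subset="match_key", keep="first")
print(donors)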

 

The next step is taking the donor's contact information and appending demographics at the individual and household level (age, income, gender, marital status, age of home, Mosaic segmentation, etc.) using the Alteryx Experian add-on product.  Alteryx Designer is invaluable for this process as we manipulate the donor data to be more useful for the non-profit.

 

B2E_DonationTableau.png

Alteryx's ability to export Tableau Extract files is key to the success of this program. We have created Tableau dashboards that highlight the following:

a. Consumer demographics

b. Mosaic marketing segmentation

c. Campaign or donation source

d. Donation seasonality / giving analysis

e. Pareto (80/20 Rule): to identify and profile the 20% of the donors who contribute 80% of the revenue (a quick sketch of this calculation follows the list)

f. Geography (city, zip, county, metro area)
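
A minimal sketch of the Pareto view mentioned in point (e), assuming a simple donor-to-total-giving table; it flags the smallest group of donors that accounts for 80% of revenue. The table and threshold are illustrative only.

import pandas as pd

donations = pd.DataFrame({
    "donor_id": [1, 2, 3, 4, 5, 6],
    "total_given": [5000, 2500, 1200, 300, 150, 50],
})

df = donations.sort_values("total_given", ascending=False).reset_index(drop=True)
df["cum_share"] = df["total_given"].cumsum() / df["total_given"].sum()
# A donor belongs to the "top" segment while the cumulative share before them
# is still below 80% of total revenue.
df["top_80pct_revenue"] = df["cum_share"].shift(fill_value=0) < 0.80
print(df)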

 

Once the data is in the Tableau Extract, business intelligence analysis is performed with visualization that is easy to understand and immediately actionable by the non-profit.  Tableau packaged workbooks are created for each non-profit so they have access to interactive analytics to help them make quick and immediate business decisions for their organization.

 

Describe the benefits you have achieved

B2E provides a niche service that many non-profits do not have the knowledge, tools or budget to complete on their own.

 

The benefits to each non-profit include the following:

  1. The donor data from each non-profit can now be processed in days instead of weeks using Alteryx. This maximizes B2E's ability to help more organizations. In the past, we only worked with one non-profit per year. Our 2016 goal is to work with twelve.
  2. A clean donor contact file, with updated addresses, deceased individuals flagged and duplicates merged, is returned to the organization. Many non-profits send out direct mail, and they immediately see their deliverability rates increase by more than 15% and their return mail rates decrease. The cost for printing and postage is optimized as well.
  3. The best way to get current donors to give more is to truly understand what they look like. Understanding the donor's life stage, giving history, demographics, lifestyle characteristics, media preferences and digital behavior is key for success. Targeting donors in a way that resonates with them has led to an increase in giving. 
  4. All non-profits want access to new donors. A profile identifies what the best donor characteristics look like. Since B2E can also acquire direct mail and email lists, we help the non-profit find "look-alike" individuals who have never donated to their organization.
  5. B2E's goal is to help each non-profit maximize the donations coming into their organization so they can keep expenses and overhead lower, while also receiving a free service they would not otherwise have acquired.

 

The impact to each non-profit is huge, but the impact to B2E is just as great: we get to use a great tool to be a leader in Iowa as a company that truly gives back to our community all year long. As of April 2016, we have provided services for:

  • Big Brothers Big Sisters of Iowa
  • Children's Cancer Connection
  • Youth Emergency Services and Shelter of Iowa
  • Governors District Alliance
  • Easter Seals Iowa

Authors: Irina Mihai (@irina_mihai), Web Analyst 

                  Johannes Wagner, Senior Business Analyst

Company: Adidas International Trading B.V.

 

Awards Category: Name Your Own - Creating the New

 

Describe the problem you needed to solve 

The ecommerce business division was facing the challenge of keeping track of and steering the performance of over 9,000 articles.

 

Senior management had an overview of top-level numbers, but the people who could actually take action and steer the business at the operational level had limited information.

 

Merchandisers tracked the sales of only the most important product franchises, which generated roughly 60% of the business, but they did not have an overview of article size availability and warehouse stock, which was vital to knowing whether driving more online traffic to an article would lead to more sales or simply to disappointed customers who couldn't find their size. Besides stock information, merchandisers also needed BI data and web analytics data in order to have a holistic understanding of article and franchise performance, a situation which caused delays in acting upon information and steering the business proactively.

 

On top of that, the full product range, and especially the low-key franchises (40% of the business), was reported on only on an ad-hoc basis. No actions were taken on the less important franchises, which led to unrealized opportunities, as unsold products are heavily discounted at the end of the season.

 

Given this complex business environment, and the time needed to get hold of data that often became obsolete before reaching the relevant stakeholders in a digestible format, we needed to provide transparency on all product franchises and all the relevant information needed to take action and drive the business at both an aggregated and a granular level, in real time, in one place, available to everyone, in an automated way.

 

To sum up, the drivers that led to a new way of working within analytics were:

 

  • Tracking ongoing performance on all articles improves our margin, because we can drive sales during the season and avoid heavy discounting at the end of it. Offering too many discounts also has a negative long-term impact on the brand and teaches consumers to buy on discount, so we wanted to make sure we maximized opportunities within the season.
  • Besides immediate financial returns, we are also thinking of the consumer experience and the fact that not finding their desired sizes online disappoints customers. Being able to drive demand planning proactively and ensure enough supply is available is a way to keep customers happy and returning to our site.

 

Describe the working solution

Alteryx has allowed us to tap into multiple sources of data in a fast, scalable way not possible before, which allows us to be truly agile and data driven as an organization.

 

On a high level, the data sources used in the workflow are:

  • BI data incl. sales data and standard margin per article per day
  • Waiting List data from the CRM system indicating the number of times an out of stock product was placed on the waiting list
  • Article master data from the range management application
  • Demand planning master data with the estimated bought quantity per size which defines the relative importance of each size of an article
  • Web analytics data for product views and conversion rate  
  • Stock quantity data from the online platform with the daily stock snapshot on size level
  • Product range files manually maintained for retail intro date,  marketing campaign information, and original sales forecast quantity per month

 

  1. There are 3 work streams used in the main workflow:
    1.1 Calculation of daily sales forecasts per article number based on the product range files and master data file.

Several operations are done to clean up the data, but the most important part is transforming the monthly forecast to a daily level while also taking the retail intro date into account. For example, if an article has a retail intro date in the middle of the month, we only generate a forecast for the days after that date and not before, to maintain accuracy.
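
A minimal pandas sketch of that disaggregation, with assumed column names and an even split over eligible days (the real workflow applies its own daily weighting):

import pandas as pd

monthly = pd.DataFrame({
    "article": ["A1", "A2"],
    "month": ["2016-05", "2016-05"],
    "forecast_units": [3100, 900],
})
intro_dates = {"A1": "2016-05-01", "A2": "2016-05-16"}  # A2 launches mid-month

rows = []
for _, r in monthly.iterrows():
    start = pd.Timestamp(r["month"] + "-01")
    all_days = pd.date_range(start, start + pd.offsets.MonthEnd(0), freq="D")
    # Only generate daily forecasts on or after the retail intro date.
    eligible = all_days[all_days >= pd.Timestamp(intro_dates[r["article"]])]
    for day in eligible:
        rows.append({"article": r["article"], "date": day,
                     "forecast_units": r["forecast_units"] / len(eligible)})

daily = pd.DataFrame(rows)
print(daily.groupby("article")["forecast_units"].sum())  # monthly totals preserved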

 

Picture1.png

 

1.2 Data cleanse operations done on web analytics and BI data and subsequent join on article and day level

 

For each data type we have created a historical Alteryx database that is unioned with the newly cleansed data, and the result is then written back into the historical database.
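
The pattern behind that step, sketched in Python with a CSV file standing in for the Alteryx historical database (the file name and key columns are assumptions):

import os
import pandas as pd

HIST_PATH = "web_analytics_history.csv"  # hypothetical historical store

def update_history(new_clean):
    if os.path.exists(HIST_PATH):
        history = pd.read_csv(HIST_PATH)
        combined = pd.concat([history, new_clean], ignore_index=True)
    else:
        combined = new_clean
    # Keep the latest row per article/day so re-runs do not duplicate data.
    combined = combined.drop_duplicates(subset=["article", "date"], keep="last")
    combined.to_csv(HIST_PATH, index=False)
    return combined

new_data = pd.DataFrame({"article": ["A1"], "date": ["2016-05-16"], "product_views": [420]})
print(update_history(new_data).tail())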

 

Picture2.png

 

1.3 Join of the daily sales forecast with the web analytics data, BI data and wishlist data on article and day level

Picture3.png

 

Here we also calculate the actual retail intro date for each article, based on the first day the product receives online traffic, giving us visibility into products that were launched late.

 

  2. In a second workflow we calculate the stock availability per article size, as well as the size and buy availability per article. This is based on the master data file indicating the buy percentage per size and article, and the stock snapshot indicating the size availability per article. The output is a Tableau data extract (a simplified sketch of this calculation follows the workflow image below).

Picture4.png
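
A simplified sketch of that calculation, assuming a buy-share table from demand planning and a daily stock snapshot per size; "buy availability" here is taken to mean the share of expected demand whose sizes are in stock, which is our reading rather than the documented formula.

import pandas as pd

buy_share = pd.DataFrame({          # demand planning: expected importance of each size
    "article": ["A1"] * 4,
    "size": ["S", "M", "L", "XL"],
    "buy_pct": [0.15, 0.40, 0.35, 0.10],
})
stock = pd.DataFrame({              # daily stock snapshot from the shop platform
    "article": ["A1"] * 4,
    "size": ["S", "M", "L", "XL"],
    "units_on_hand": [0, 12, 3, 0],
})

df = buy_share.merge(stock, on=["article", "size"])
df["in_stock"] = df["units_on_hand"] > 0
df["available_buy_pct"] = df["buy_pct"] * df["in_stock"]
summary = df.groupby("article").agg(
    size_availability=("in_stock", "mean"),           # share of sizes in stock
    buy_availability=("available_buy_pct", "sum"),    # share of expected demand covered
)
print(summary)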

 

The outputs of the two workflows are then visualized in a Tableau dashboard that has a flow-like structure, allowing users to see the performance of the product franchises at a high level and also drill down into details at article level:

 

Picture1.png

 

Picture2.png

 

Picture3.png

 

Picture4.png

 

 

Describe the benefits you have achieved

First of all, without Alteryx the Trading Dashboard would not have been possible, due to the sheer amount of data sitting in different systems and the manual work involved in retrieving and combining it at the same level of granularity.

Alteryx has allowed us to blend a variety of data sources in a scalable way and achieve the following business benefits:

 

  • In terms of time savings: prior to using Alteryx, two full-time employees would have been needed to compile an in-season daily snapshot of the most important product franchises (60% of the business) with all the relevant metrics. By the time this report reached stakeholders, the information would have been obsolete, making it impossible to react quickly to consumer behavior. Now, with the help of Alteryx, it takes the analytics team 10 minutes per day to provide a holistic dashboard to both senior management and the employees who can take quick decisions and steer the business based on real-time data.
  • Increased revenue and margin optimization: our merchandisers and category managers now have a complete daily overview of how every single article is performing. Due to the exploratory and intuitive nature of the dashboard (from top level down to article level, with coloring based on forecast achievement), they can easily identify which product franchises and individual products are falling behind the sales forecast and which specific levers to pull in order to increase sales. Example actions are driving more traffic, improving on-site merchandising, restocking particular sizes and decreasing the price.
  • Customer satisfaction: as sizes are restocked faster than before thanks to the demand planning department's new proactive way of working, consumers are happier because they can purchase their desired sizes. This leads to more customers returning to our site, because they know they can find sizes here that are not available in retail stores.

 

We have recently introduced the Trading Dashboard and there is already a mindset shift happening where different departments work more closely together to identify opportunities and act on the data. We believe Alteryx has enabled us to reach our ambitious growth targets, improve customer satisfaction and operate as a data-driven organization.

 

Author: Thomas Ayme, Manager, Business Analytics

Company: Adidas International Trading B.V

 

Awards Category: Name Your Own - Best Planning and Operational Use

 

Describe the problem you needed to solve 

As a new and successful business, adidas Western Europe eCommerce keeps growing faster and faster: new services are launched every week, an increasing number of marketing campaigns run simultaneously, and so on. This leads to more and more products having to be shipped out to our end consumers every day.

 

This strong growth leads to an exponential increase in the complexity of forecasting our unit and order volumes, but also to bigger costs in the case of forecasting mistakes or inaccuracies.

 

As these outbound volumes kept increasing, we were faced with the need to develop a new, more accurate, more detailed and more flexible operational forecasting tool.

 

Such a forecasting tool would have to cater to the complexities of forecasting for 17 different markets rather than a single pan-European entity. Indeed, warehouse operations and customer service depend on a country-level forecast to plan carriers and linguistic staff. This is a unique situation where, on top of having a rapidly growing business, we have to take into account local marketing events and market specificities.

 

Finally, given the importance of ensuring consumer satisfaction through timely delivery of their orders, we also decided to provide a daily forecast for all 17 markets rather than the usual weekly format. Such a level of detail improves the warehouse's shipping speed but also increases, once again, the difficulty of our task.

 

Describe the working solution

 

Our first challenge was to find reliable sources of information. Both business analytics data (financial and historical sales data) and web analytics data (traffic information) were already available to us through SAP HANA and Adobe Analytics. However, none of our databases captured in a single place all the information related to marketing campaigns, project launches, events, ad-hoc issues, etc.

 

That is why we started by building a centralized knowledge database, which contains all past and planned events that can impact our sales and outbound volumes.

 

This tool is based on an Alteryx workflow which cleans and blends together all the different calendars used by the other eCommerce teams. In the past, bringing those files together was a struggle, since some of them are based in Excel while others live in Google Sheets; moreover, each uses a different format.
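
In spirit, the blending step looks something like the sketch below; the two source tables and their column names are purely hypothetical stand-ins for the teams' Excel and Google Sheets calendars.

import pandas as pd

# Hypothetical extracts standing in for the differently formatted calendars.
campaigns = pd.DataFrame({"Campaign": ["Summer Sale"],
                          "Start": ["2016-06-01"], "End": ["2016-06-14"]})
launches = pd.DataFrame({"project": ["New size guide"], "go_live": ["2016-06-10"]})

events = pd.concat([
    campaigns.rename(columns={"Campaign": "event", "Start": "start_date", "End": "end_date"})
             .assign(event_type="campaign"),
    launches.rename(columns={"project": "event", "go_live": "start_date"})
            .assign(end_date=lambda d: d["start_date"], event_type="launch"),
], ignore_index=True)

for col in ("start_date", "end_date"):
    events[col] = pd.to_datetime(events[col])
print(events)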

 

Workflow Knowledge database.jpg

 

We made the most of now having a centralized event database by also developing a self-service visualization tool in Tableau, which displays all those past and future events. Such a dashboard is now used to:

 

  1. Give some background to our stakeholders about what is driving the volumes seen in the forecast.
  2. Get an overview of the business when we review the sales targets for the coming weeks.

 

Next, we created a workflow which, thanks to this new centralized event database, defines a set of "genes" for each past and upcoming day and for each market. These genes flag potential ad-hoc issues, commercial activations, discount levels, newsletter send-outs, etc.

 

This gene system can then be used to define the historical data to be used to forecast upcoming periods, by matching future and past days that share the same, or at least similar, genes. This is the first pillar of our forecasting model.
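
A toy sketch of the matching idea, where each day carries a set of gene flags and future days borrow history from the past days with the greatest overlap; the Jaccard score below is an illustrative choice, not the workflow's actual rule.

history = {
    "2016-04-04": {"newsletter", "discount_20", "weekday"},
    "2016-04-11": {"discount_20", "weekday"},
    "2016-04-16": {"weekend"},
}
future = {"2016-06-06": {"newsletter", "discount_20", "weekday"}}

def best_matches(genes, history, top_n=2):
    # Rank past days by how similar their gene sets are to the future day's genes.
    scored = sorted(history.items(),
                    key=lambda kv: len(genes & kv[1]) / len(genes | kv[1]),
                    reverse=True)
    return [day for day, _ in scored[:top_n]]

for day, genes in future.items():
    print(day, "->", best_matches(genes, history))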

 

The second pillar of our forecasting tool is a file containing our European weekly targets. These targets are constantly being reviewed based on new events shown in the centralized event database and current business trends. 

An Alteryx workflow derives from this target file our sales expectation for each upcoming day, market, category (full price, clearance) and article type (inline or customized). To do so, we use the historical data defined by our genes, together with a set of algorithms, to calculate the sales impact ratio of each market and category. These ratios are then used to allocate a target to each combination.
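
A simplified sketch of that allocation, using made-up share-of-sales ratios; the real workflow derives its ratios from the gene-matched historical days and its own algorithms.

import pandas as pd

weekly_target_units = 70000
historical = pd.DataFrame({          # sales on comparable ("same gene") days
    "market": ["DE", "DE", "FR", "FR"],
    "category": ["full price", "clearance", "full price", "clearance"],
    "units_sold": [30000, 10000, 15000, 5000],
})

historical["share"] = historical["units_sold"] / historical["units_sold"].sum()
historical["weekly_target"] = weekly_target_units * historical["share"]
# Spread each market/category target evenly over 7 days (the real logic uses daily ratios).
daily = historical.assign(daily_target=lambda d: d["weekly_target"] / 7)
print(daily[["market", "category", "weekly_target", "daily_target"]])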

 

worlkflow forecasting.png

 

Finally, both pillars are brought together and, in a final Alteryx workflow, we derive how many orders and units will have to be placed in each market and for which article type.

 

However, since certain periods of time have gene combinations that cannot be matched, our working solution also gives us the flexibility to manually override the results. These forecast volumes are then shared with the team, warehouse, customer service call centers, etc. through a Tableau dashboard.

 

Knowledge database.png

 

Describe the benefits you have achieved

Thanks to the work that went into developing this new forecasting model in Alteryx, the adidas WE eCommerce business ended up getting:

 

  • A more accurate forecasting model, which allows for a better planning of our operations.
  • Reduced operational costs.
  • A more detailed forecast as we can now forecast on a daily level, when past methods required much more work and limited us to a weekly forecast.
  • A flexible forecasting model that can easily be modified to include new services and sales channels.
  • A forecast dashboard that lets us easily communicate our forecast to an ever growing number of stakeholders.
  • A centralized event “calendar” that can be used by the entire department for much more than simply understanding the forecast (e.g. it is used to brief in Customer Service teams on upcoming events).
  • A massive amount of free time that can be used to drive other analyses, as we no longer have to manually join together different marketing calendars and other sources of information, create manual overviews of the upcoming weeks, manually split our weekly sales target, etc.