
At Alteryx, we’re constantly impressed with the amazing things that you do with our software. The Alteryx Analytics Excellence Awards recognize and celebrate your best Alteryx success stories.

The submission period for the 2017 Alteryx Analytics Excellence Awards has ended. A HUGE Thank You to all who participated this year! We wish you the best of luck!

  • To browse qualified entries, click here.
  • Winners will be announced at Inspire 2017 in Las Vegas, June 7th.

For full rules and details, click here.



Author: Alexandra Wiegel, Tax Business Intelligence Analyst 
Company: Comcast Corp


Awards Category: Best Business ROI

 

A Corporate Tax Department is not typically associated with a Business Intelligence team sleekly mining large data sources for insights. Alteryx has allowed our Tax Business Intelligence team to provide incredibly useful insight to several branches of our larger Tax Department. Today, almost all of our data is in Excel or CSV format, so data organization, manipulation, and analysis had previously been accomplished within the confines of Excel, with occasional Tableau for visualization. Alteryx has given us the ability to analyze, organize, and manipulate very large amounts of data from multiple sources. Alteryx is exactly what we need to solve our colleagues' problems.


Describe the problem you needed to solve

Several weeks ago we were approached about using Alteryx to do a discovery project that would hopefully provide our colleagues further insight into the application of tax codes to customer bills. Currently, our Sales Tax Team uses two different methods to apply taxes to two of our main products respectively. The first method is to apply Tax Codes to customer bill records and then run those codes through software that generates and applies taxes to each record. The second method is more home-grown and appears to be leading to less consistent taxability on this side of our business.

 

Given that we sell services across the entire country, we wanted to explore standardization across all our markets. So, our Sales Tax team tasked us with creating a workflow that would compare the two methods, develop a plan toward standardization, and quantify the effect standardization would have on every customer's bill.

 

Describe the working solution

Our original source file was a customer level report where the records were each item (products, fees, taxes, etc.) on a customer’s bill for every customer in a given location. As it goes with data projects, our first task was to cleanse, organize, and append the data to make it uniform.

 

 

The next step was to add in the data from several sources that we would ultimately need in order to show the different buckets of customers according to the monetary changes to their bills. These sources were all formatted differently, and there was often no unique identifier we could use to join them to our original report. Hence, we had to create a method to ensure we did not create duplicate records when using the Join tool. We ended up using this process multiple times (pictured below).
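Outside Alteryx, the same guard against join fan-out can be sketched in a few lines of pandas (the column names here are hypothetical, not from the actual workflow): collapse the incoming source to one row per join key before joining, so the left side's record count is preserved.

```python
import pandas as pd

# Bill-level report: one row per line item on a customer's bill.
bills = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "item": ["Video", "Fee", "Internet"],
})

# Supplemental source with no guaranteed unique key (duplicate rows per customer).
extra = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "region": ["NE", "NE", "SW"],
})

# Collapse the right side to one row per join key before joining,
# so the join cannot multiply records on the left side.
extra_unique = extra.drop_duplicates(subset="customer_id")
merged = bills.merge(extra_unique, on="customer_id", how="left")

assert len(merged) == len(bills)  # no duplicate records created
```

The same pattern applies each time a new source is layered in, which is why a reusable process (or, in Alteryx, a macro) pays off.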

 

 

From there, the workflow grew. We added tax descriptions, new codes, and other information, along with calculated fields to determine the amount of tax each customer should owe today, based on our current coding methods.

 

 

After we had layered in all the extra data needed to create our buckets, we distinguished between the two lines of business and added in the logic to determine which codes, at present, are taxable.

 

 

For the side of our business whose taxability is determined by software, you will notice that the logic is relatively simple. We added in our tax codes using the same joining method as above and then used a single join to a table that lists the taxable codes.

 

 

For the side of our business whose taxability is determined by our home-grown method, you can see below that the logic is more complicated. Currently, the tax codes for this line of business are listed in such a way that we must parse a field and stack the resulting records in order to isolate individual codes. Once we have done this, we can apply the taxability logic, then use the result as a lookup against the actual record to determine whether its code column contains a tax code that has been marked as taxable. In other words, applying our home-grown taxability logic is complicated, time-consuming, and leaves much room for error.
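The parse-and-stack lookup described above can be sketched in pandas (the delimiter, codes, and column names are illustrative assumptions): split the multi-code field into rows, flag each code against the taxable list, and roll the flags back up to the record.

```python
import pandas as pd

# Illustrative layout: each record lists several tax codes in one delimited field.
records = pd.DataFrame({
    "record_id": [101, 102],
    "codes": ["A1;C3", "B2;D4"],
})
taxable_codes = {"B2", "D4"}  # codes flagged taxable in the lookup table

# Parse the delimited field and stack one code per row, then flag taxable codes.
stacked = records.assign(code=records["codes"].str.split(";")).explode("code")
stacked["taxable"] = stacked["code"].isin(taxable_codes)

# A record is taxable if ANY of its codes is taxable; roll the flags back up.
record_taxable = stacked.groupby("record_id")["taxable"].any()
```

In Alteryx terms this corresponds roughly to Text To Columns (split to rows), a join against the taxable-code table, and a Summarize back to the record level.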

 

 

Once we stacked all this data back together, we joined it with the new tax code table. This gives us the new codes so that the software can be used for both lines of business. Once we know these new codes, we can simulate the software's process and determine which of the new codes will be taxable.

 

 

Knowing whether or not codes are taxable helps us hypothesize about how problematic a geographic location may end up being for our team, but it does not tell us the dollar amount of taxes that will be changing. To know this we must output files that will be run through the real software.

 

Hence, once we have completed the above data manipulation, cleansing, and organization, we extract the data that we want run through the software and reformat the records into the format the software recognizes.

 

 

We created the above two macros to reformat the columns in order to simplify this extensive workflow. Pictured below is the top macro. The only difference between the two resides in the first Select tool, where different fields are specified for output.

 

 

After the reformatting, we output the files and send them to the software team.

 

 

When the data is returned to us, we will be able to determine the current amount of tax being charged to each customer as well as the amount that will be charged once the codes are remapped. The difference between these two will then become our buckets of customers, and our Vice President can begin to understand how the code changes will affect our customers' bills.

 

Describe the benefits you have achieved

Although this project took several weeks to build in Alteryx, it was well worth the time invested, as we will be able to reuse it for any other location. We have gained incredible efficiency in acquiring insight on this standardization project using Alteryx. Another benefit we have seen is the flexibility to make minor changes to our workflow, which has helped us easily customize it for different locations. All of the various Alteryx tools have made it possible for the Tax Business Intelligence team to assist the Tax Department in accomplishing large data discovery projects such as this.

 

Further, we have begun creating an Alteryx app that can be run by anyone in our Tax Department. This frees up the Tax Business Intelligence team to work on other high-priority projects.

A common benefit theme amongst Alteryx users is that Alteryx workflows save companies large amounts of time in data manipulation and organization. Moreover, Alteryx has made it possible (where it is impossible in Excel) to handle large and complicated datasets in a very user-friendly environment. Alteryx will continue to be a very valuable tool that the Tax Business Intelligence team will use to help transform the Tax Department into a more efficient, more powerful, and more unified organization in the coming years.

 

How much time has your organization saved by using Alteryx workflows?

We could never have done this data discovery project without using Alteryx.  It was impossible to create any process within Excel given the quantity and complexity of the data.

 

In other projects, we have replicated in Alteryx the Excel reconciliation processes that run annually, quarterly, and monthly. The Alteryx workflows have saved our Tax Department weeks of manual Excel pivot table work. Time savings on individual projects range from a few hours to several weeks.

 

What has this time savings allowed you to do?

The time savings has been invaluable. The Tax Department staff are now able to free themselves of repetitive tasks in Excel, obtain more accurate results, and spend their time analyzing and understanding the data. This "smarter" time spent on analysis will help transform the Tax Department, with greater opportunities to add further value to the company.

Author: Renilton Soares de Oliveira - Executive Director, Conjecto

Team Members: Ricardo Mendes, Roberto Teófilo, Valter Cazassa

Company: Central de Recuperação de Crédito - CRC

Business Partner: Conjecto

 

Awards Category: From Zero to Hero

First of all, as far as we know, CRC was the first Alteryx customer in Brazil. We were therefore starting with Alteryx from zero not just from a single-company perspective; we were also starting a journey that inspired several other companies in Latin America. In fact, CRC's success story was shared at the first two Alteryx User Group meetings in Brazil.

 

Within one month of starting with Alteryx in early 2014, we had eliminated all the risky, time-consuming manual activities of one of the most important data blending processes, which previously took a CRC line-of-business (LOB) analyst four days of hard work. Now, using Alteryx workflows, the whole process runs in less than 15 minutes, which has brought significant productivity improvements and more accurate results not only for this particular process, but also for several others that depend upon it. The extraordinary results obtained in the first Alteryx use case at CRC sparked new initiatives to explore other opportunities with data blending and advanced data analytics. Alteryx is now an essential part of CRC business operations.

 

Awards Category: Best Value Driven with Alteryx

In 2015, just two years after it started using Alteryx, CRC experienced almost 30% revenue growth and a 15% reduction in headcount. That's a remarkable accomplishment for our business, and we have no doubt that Alteryx solutions and Conjecto analytics services played a very important role in making that happen. We love the independence and flexibility that Alteryx brought to our LOB analysts. Our users are much more productive, and CRC is now a more data-driven company than it used to be. That was a big boost to our business.

 

Describe the problem you needed to solve

When CRC first started using Alteryx in the first quarter of 2014, the company was beginning a rapid expansion of its business activities that eventually led it to bring more strategic accounts to its customer base and, as a natural consequence, more data sources and complexity to its operations. In this challenging scenario, it would be virtually impossible to reach its goals without innovative, effective solutions for data analysis and decision support in different levels of the organization.

 

Alteryx solutions have been extremely helpful to us in solving several relevant business problems. A major challenge for CRC prior to Alteryx was the need to quickly and effectively improve the productivity of critical, complex daily business processes based on blending data from heterogeneous sources with huge amounts of information that could not be handled in a timely and appropriate way with traditional software systems. Moreover, CRC operations was looking for effective ways to use machine learning algorithms to improve business performance without paying for expensive software infrastructure or hiring expensive data science experts.

 

Describe the working solution

With Alteryx software, we solved the productivity problem in those critical daily data blending processes, handling the heterogeneous sources and large volumes of information that traditional software systems could not process in a timely and appropriate way.

 

CRC has also implemented more advanced analytics based on Alteryx predictive and spatial tools. One of the main business problems we have successfully solved with those tools is identifying which individuals or companies are most likely to respond positively to an action that will eventually lead them to pay their debts. That's pretty much what we need to do on a daily basis to get better results for our customers and our organization.

 

Regarding data blending functionality, CRC has taken advantage of most of the data prep tools, including the Preparation, Parse, Join, Transform, Developer, and In-Database categories. To implement advanced analytics applications and macros, our team has used the predictive tools (e.g., boosted, logistic regression, and decision tree models) and the grouping tools (for unsupervised learning).

 

Describe the benefits you have achieved

In fact, all sponsors of this initiative recognize the extraordinary results accomplished thus far in business growth and customer satisfaction. Business and decision-making processes have been greatly improved, which has had a profound impact on corporate results. In 2015, just two years after it started using Alteryx, CRC experienced almost 30% revenue growth and a 15% reduction in headcount. That's a remarkable accomplishment, and we have no doubt that Alteryx solutions and Conjecto analytics services played a very important role in making that happen. Nowadays, CRC is well known in Brazil as one of the most analytics-driven companies in financial services.

 

CRC-1.png

Author: Jason Claunch - President

Company: Catalyst

Business Partner: Slalom Consulting - Sean Hayward & Marek Koenig

 

Awards Category: Best Use of Alteryx for Spatial Analytics

 

The developed solution used many of the Spatial Analytics components available within Alteryx:

  • Trade Area – let the user select a target area to analyze
  • Spatial Match – combine multiple geospatial objects
  • Intersection – cut objects from each other to create the subject area
  • Grid – sub-divide the trade blocks to determine complete coverage of the trade ring
  • Distance – use drivetime calculation to score and rank retailers in the vicinity

Describe the problem you needed to solve

Retail site analysis is a key part of our business and was taking up too much time with repetitive tasks that could have been easily automated.

 

Describe the working solution

To support selection of best-fit operators, Catalyst partnered with Slalom Consulting to develop a tool that identifies potential uses to target for outreach and recruitment. Previously, we would manually build demographic profiles using tools like QGIS, Esri, and others, but found the process cumbersome and quite repetitive. Demographic data was acquired at the trade block level, which was too granular to identify target locations and would not mesh well with the retail data.

 

Alteryx and its spatial capabilities were used in a few ways:

 

1) Minimize our retail data selection from the entire US to a selected state using the Spatial Match tool.

catalyst1b.png

 

2) Create a demographic profile for each retail location consisting of data points such as median income, population, daytime employees, and others. The data was aggregated within a 3-mile radius of each retail location using an Alteryx macro composed of the Trade Area, Grid, Spatial Match, and Summarize tools.
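For readers without the spatial tools, the radius aggregation inside the macro can be approximated in plain Python (the block coordinates and fields below are invented for illustration): keep every demographic block whose centroid lies within 3 miles of the site, then summarize.

```python
import math
import pandas as pd

def miles_between(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in miles.
    r = 3958.8  # Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical demographic blocks (centroids) and one retail site.
blocks = pd.DataFrame({
    "lat": [33.0, 33.01, 34.0],
    "lon": [-96.0, -96.02, -96.0],
    "population": [1200, 800, 5000],
})
site = {"lat": 33.0, "lon": -96.0}

# Keep blocks whose centroid falls within the 3-mile ring, then summarize
# (Alteryx: Trade Area + Spatial Match + Summarize).
within = blocks[blocks.apply(
    lambda b: miles_between(site["lat"], site["lon"], b["lat"], b["lon"]) <= 3.0,
    axis=1,
)]
profile = {"population": int(within["population"].sum())}
```

The Grid tool in the real macro additionally sub-divides blocks so partial overlaps are apportioned, which this centroid test does not attempt.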

 

catalyst2-A.png catalyst2-B.png catalyst2-C.png

 

3) Using a Map Input, the user selected an area to profile and candidate retailers were output for further review.

catalyst3.png

 

4) After selecting specific retailers for in-depth analysis, Alteryx would score all possible locations by distance (drivetime analysis) and by score (a proprietary weighting of various demographic attributes). The profiled results were then used to build a client presentation; the automated profiling tool saved us countless hours and allowed us to deliver more detailed analysis for our clients.

 

catalyst4.png

 

Describe the benefits you have achieved

Using Alteryx was a massive time saver: the tool we built took a process that normally required at least 8 hours of manual work down to merely a few minutes. This has directly benefited our bottom line by allowing us to focus on more key tasks in our client outreach and recruitment. A return on investment was immediately realized after we were able to close a deal with a major client using our new process.

Author: Erik Miller (@erik_miller), Sr Systems Engineer - Cyber Security Analytics

 

Awards Category: Most Time Saved

 

Describe the problem you needed to solve

My team's story starts from the ground level of analytics: no tools, no resources, no defined data sources. But our Information Security team had an idea: report on all of Western Union's Agent Locations (think Kroger grocery stores, mom & pop shops, etc.) and the risk they posed by not having certain security measures implemented - look at every PC/terminal they have to determine their individual risks (2.4 million when we started), their fraud history, their transaction limits, etc., and risk-rate every one of those 500,000+ locations. We completed a proof of concept and realized it was completely unsustainable, requiring over 100 hours every month to produce what outwardly looked like a simple report. So we rebuilt the process in Alteryx. With just a little over 2.5 hours of build time in the tool, we took a process that dominated my time and turned it into 5½ minutes of runtime. What's more, we've turned this POC into a full-fledged program and department, focused on risk analytics surrounding employee & contractor resource usage (malicious or uneducated insiders), customer web analytics (looking for hackers), and further Agent analytics.

 

Beyond our humble beginnings, there's the constant threat of data breaches, fraud, and malicious insiders in the Information Security world - it's the reality of the work we do. The ability to build out a strategic analytics program has been a huge step in the right direction for our industry and company, and it's not an area many other companies have been able to focus on, which also puts us ahead of the curve.

 

Describe the working solution

We are using Alteryx to assess several data sources - HR data sets for active/terminated employees & contractors, clickstream data from our digital assets and websites, security data from our Netezza system, fraud data, log files from our various security platforms, user behavior data from our UBA (User Behavior Analytics) system, Identity and Access Management attributes/entitlements, system infection logs, installed applications, etc., etc. As I've said in other talks, we don't have a data lake, we have an ocean.

 

We are currently exporting our data to Tableau tde files, Hadoop, and MySQL databases. In addition, we have started looking/experimenting with our Alteryx Server implementation (which I support for our company).

 

Describe the benefits you have achieved

Overall time savings is nearing 150 hours a month - a massive savings, and the ability for our team to stay incredibly lean: no additional FTEs needed as we take on more and more data and challenges. We've also gained visibility into the security implementations of all 500,000+ worldwide locations - something we didn't have before, and which helps us drive the business to implement security features where needed - based on logic, numbers, and fraud data, not feelings.

 

We are also able to provide insights into our user base - how our employees are using our assets, what they are doing that lowers our security posture, and how they are getting infected. We're providing insights that help our company become more secure.

 

How much time has your organization saved by using Alteryx workflows?

What has this time savings allowed you to do?

With just our first workflow, we saved over 100 hours per month - so over a full FTE of time has been taken off of my plate. Alteryx has allowed us to not only save time each month, but keep our team incredibly lean (we only have three people, and that's all we need to churn through massive amounts of security & fraud data each month).

 

So what has this time savings allowed us to do? Many, many things.

 

First, I was promoted to Sr. Systems Engineer - Cyber Security Analytics. With that change in title, also came the opportunity to build out a strategic-focused Information Security Analytics team, focused on looking at all security data throughout the company and identifying areas where we can improve our security program and posture.

 

Second, it's allowed me time to work with other departments to build out their analytics programs and help them learn to use the Alteryx tools in their respective areas.

 

Third, it's allowed my team to work on new, expanding projects with great ease.

Author: Jack Morgan (@jack_morgan), Project Management & Business Intelligence

 

Awards Category: Most Time Saved

 

After adding up the time savings for our largest projects we came up with an annual savings of 7,736 hours - yea, per year! In that time, you could run 1,700 marathons, fill 309,000 gas tanks or watch 3,868 movies!! Whaaaaaaaaaaaaat! In said time savings, we have not done any of the previously listed events. Instead, we've leveraged this time to take advantage of our otherwise unrealized potential for more diverse projects and support of departments in need of more efficiency. Other users that were previously responsible for running these processes now work on optimizing other items that are long overdue and adding value in other places by acting as project managers for other requests.

 

Describe the problem you needed to solve 

The old saying goes, "Time is of the essence," and there are no exceptions here! More holistically, we brought Alteryx into our group to better navigate disparate data and build one-time workflows that create sustainable processes with a heightened level of accuracy. In a constraint-driven environment, my team is continuously looking for ways to do things better - whether that is faster, more accurately, or with less oversight. The bottom line is that Alteryx provides speed, accuracy, and agility we never thought possible. Cost - and the most expensive resource of all, human time - has been a massive driver for us throughout our Alteryx journey, and I'd expect these drivers will continue as time passes.

 

Describe the working solution

Our processes vary from workflow to workflow; overall we use a lot of SQL Server, Oracle, Teradata, and SharePoint. In some workflows we blend 2 sources; in others we blend all of them. It depends on the need of the business we are working with on any given day. Once the blending is done we do a variety of things with it: sometimes it goes to apps for self-service consumption, and other times we push it into a data warehouse. One thing that is consistent in our process is final data visualization in Tableau! Today, upwards of 95% of our workflows end up in Tableau, allowing us to empower our users with self-service analytics and reporting. When using databases like SQL Server and Oracle we see MASSIVE gains from the In-Database tools. The ability for our Alteryx users to leverage such a strong no-code solution creates an advantage for us in the customer service and analytics space, because they already understand the data, and now they have a means to get to it.

 

Audit Automation:

 

Billing:

 

 

File Generator:

 

Market Generator:

 

 

Parse:

 

Describe the benefits you have achieved

The 7,736 hours mentioned above is cumulative of 7 different processes that we rely on, on a regular basis.

 

  1. One prior process took about 9 days/month to run - we've dropped that to 30s/month!
  2. Another process required 4 days/quarter; our team cut it to 3 min/quarter.
  3. The third and largest workflow would have taken an estimated 5,200 hours to complete; our team took 10.4 hours to do the same work!
  4. The next project was a massive one: we needed to create a tool to parse XML data into a standardized Excel format. This process once took 40 hrs/month (non-standard pdf to excel) and now runs in less than 5s/month!
  5. Less impressive, but still a great deal of time: our systems and QA team contracted us to rebuild their daily reporting for Production Support metrics. This process took them about 10 hours/month; we got it to less than 15 sec/day.
  6. One of our internal QA teams asked us to help speed up the prep work for their weekly audit process. We automated a process that took them upwards of 65 hours/month into one that now takes us 10 sec/week!
  7. The last of the 7 processes is one for survey data that took a team 2 hours/week to process. That same process takes our team about 20 sec/week.
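As a rough illustration of the XML-parsing project in item 4, a minimal XML-to-table parse might look like this in Python (the tag names and layout are invented; the real schema isn't shown in the write-up):

```python
import xml.etree.ElementTree as ET
import pandas as pd

# Invented billing XML; the actual source format is not disclosed.
xml = """
<bills>
  <bill id="101"><amount>42.50</amount><market>Denver</market></bill>
  <bill id="102"><amount>17.25</amount><market>Boise</market></bill>
</bills>
"""

root = ET.fromstring(xml)
rows = [
    {"id": b.get("id"),
     "amount": float(b.findtext("amount")),
     "market": b.findtext("market")}
    for b in root.iter("bill")
]
df = pd.DataFrame(rows)  # standardized tabular layout
# df.to_excel("bills.xlsx", index=False)  # Excel export (needs openpyxl)
```

Once flattened to rows like this, the standardized output can be regenerated in seconds each month, which is where the 40 hrs/month savings comes from.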

 

We hope you've found our write-up compelling and win-worthy!

 

Author: Slaven Sljivar, Vice President, Analytics

Company: SmartDrive Systems, Inc.

 

Awards Category: Most Time Saved

 

Describe the problem you needed to solve

SmartDrive’s Analytics team, now approaching its 9th year in our 12-year-old company, is focused on three areas: 1) customer-facing analytics, 2) analytics supporting internal teams, and 3) analytics embedded within our product.  To support these activities, we rely on a well-developed data warehousing and business intelligence stack that includes Tableau, R, SQL Server (for a relational dimensional data warehouse), and SQL Server Analysis Services cubes.

 

Alteryx, which we first started using only 5 months ago (March 2016), fills a gap in our ability to quickly integrate data.  Prior to Alteryx, we relied on a combination of R scripts, SQL stored procedures, and SQL Server Integration Services (SSIS) jobs to develop data integration solutions.  While this approach worked for us over the years, it had several drawbacks:

  1. It was a more “code-heavy” approach than we liked. While our Analytics team is comprised of competent coders and scripters, we seek to minimize the amount of code we generate (and maintain!)
  2. It was relatively slow and labor-intensive. A project that involved data integration took much longer to complete than a project that could be completed with “curated” data that already existed in our data warehouse and/or cubes.
  3. It was not very maintainable. Once a failure occurred or an enhancement was needed, dealing with code made it more difficult to get into “flow of things” compared to dealing with visual workflows.

 

One specific example is a repetitive analysis that we call “Fuel Savings Analysis” (FSA).  The goal of this analysis is to evaluate how much fuel our customers (commercial vehicle fleets) saved from drivers operating their vehicles differently after SmartDrive’s video event recorders were installed in the vehicles.  Because video event recorders activate in response to unsafe and abrupt maneuvers, drivers tend to avoid executing such maneuvers.  These maneuvers also often lead to fuel waste.  For example, harsh braking wastes more kinetic energy than gradually coasting down and using the kinetic energy (and not fuel) to overcome the rolling friction and aerodynamic drag. 

 

We had already developed a tool that automated the FSA analysis, utilizing stored procedures, R code, custom data cubes, and Tableau.  However, the tool required several manual steps and could only be run for one customer at a time.  As a result, SmartDrive’s Account Management team had to make a request of the Analytics team whenever the analysis needed to be run, and the Analytics team needed to expend 2 to 3 hours of effort per request.

 

In April 2016, one month after we started using Alteryx, our Marketing team asked for an analysis assessing the fuel savings for all SmartDrive customers.  They were interested in including that statistic in an upcoming momentum press release.  Of course, this was not achievable with the existing tool, so we thought we would try to implement the workflow in Alteryx.  We were ultimately successful in supporting this request, leading to the following paragraph being included in the April 12th, 2016 press release:

 

Saved customers an average of $4,903 per vehicle per year—with annual per vehicle savings of $1,878 in collision exoneration, $1,784 in collision cost reduction, and $1,240 in fuel expense


Describe the working solution

Our Alteryx workflow solution issues several queries against the data warehouse, with the primary (and largest) query returning fuel consumption and distance driven for each customer vehicle, for each week the vehicle was driven. This is combined with a dataset that tracks when each customer site was installed with SmartDrive, so that baseline and treatment period data can be separated. An R script that employs a decision tree (rpart) to group vehicles is embedded within the workflow.

The key calculation - the expected fuel consumption in the treatment period (i.e., the scenario that removes the effect of SmartDrive) - is done in Alteryx, and the resulting dataset is published to Tableau Server. We authored a Tableau workbook that implements additional calculations (e.g., % fuel savings, $ savings) and allows our Account Management team to create visuals that can be shared directly with customers.

The Alteryx workflow is scheduled to run weekly every Tuesday. In less than 30 minutes, it processes the entire customer dataset, with the bulk of the time spent waiting for the data warehouse to generate the vehicle-week extract. The entire workflow is shown in the image below.
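The core counterfactual calculation - expected treatment-period fuel from the baseline gallons-per-mile rate - can be sketched as follows (toy numbers; the real workflow also groups comparable vehicles with an rpart decision tree before computing rates):

```python
import pandas as pd

# Toy vehicle-week data: gallons and miles per week, split into baseline
# (pre-install) and treatment (post-install) periods.
weeks = pd.DataFrame({
    "vehicle": ["V1"] * 4,
    "period": ["baseline", "baseline", "treatment", "treatment"],
    "gallons": [100.0, 110.0, 95.0, 90.0],
    "miles": [600.0, 640.0, 620.0, 600.0],
})

by_period = weeks.groupby("period")[["gallons", "miles"]].sum()
baseline_rate = by_period.loc["baseline", "gallons"] / by_period.loc["baseline", "miles"]

# Expected treatment fuel = baseline gallons-per-mile x treatment miles
# (the counterfactual with no behavior change); savings = expected - actual.
expected = baseline_rate * by_period.loc["treatment", "miles"]
savings = expected - by_period.loc["treatment", "gallons"]
```

Downstream calculations such as % fuel savings and $ savings then follow directly from `savings` and a fuel price.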

 

 

Describe the benefits you have achieved

In this particular example, Alteryx allowed us to completely streamline a process that was already largely automated using other tools. While we could have invested more time to fully automate the existing tool, that would have involved so much effort that we have repeatedly decided to de-prioritize that work.

 

Now that we have a fully-streamlined process, our Account Management team is able to “pull up” the Fuel Savings Analysis visualization (“report”) on their own, with up-to-date results. Also, our Marketing team is able to report on the overall actual fuel savings realized by SmartDrive customers.

 

Beyond the Analytics team no longer needing to spend time and effort on running the Fuel Savings Analyses, this new capability allows our Account Management team to more consistently present the fuel savings results to our customers, particularly those that are still piloting SmartDrive. This leads to increased revenue from improved pilot conversion and also greater customer satisfaction stemming from the knowledge that their investment in SmartDrive service is generating positive financial returns.

Author: Mandy Luo, Chief Actuary and Head of Data Analytics

Company: ReMark International

 

Awards Category: Best Use of Predictive

As a trained statistician, I understand why "70% data, 30% model" is not an exaggeration. Therefore, before applying any regression models, I always make sure the input data are fully reviewed and understood. I use the various data preparation tools to explore, filter, select, sample, and join data sources, and the data investigation tools to conduct or validate statistical evaluations. Next, I usually choose 3-5 predictive modeling candidates depending on the modeling objective and data size, often including one machine learning method, if only to benchmark the other models. After the candidate models finish running, I select the best model based on both art (whether the coefficients look reasonable given my understanding of the data and business) and science (statistical criteria like goodness of fit, p-values, and cumulative lift). I also often use the render function for model presentation and the scoring/sorting functions for model validation and application.

 

Describe the problem you needed to solve 

ReMark is not only an early adopter of predictive modeling for life insurance, but also a true action taker on customer centricity, focusing on customer lifetime analytics (instead of focusing on 'buying' only). In this context, we need to 'join up' our predictive models on customer response, conversion, and lapse in order to understand the most powerful predictors that drive customer activities across the pre- and post-sales cycle. We believe the industry understands that it is insufficient to focus on any single customer activity, but it is still exploring how this can be improved through modeling and analytics, which is where we can add value.

 

Describe the working solution

Our working solution goes with the following steps:

  1. Match over one year of post-sales tracking data back to sales payment data and marketing data (all de-personalized)
  2. Build 3 predictive models: sale (whether the purchase is agreed or not), conversion (whether the first premium bill is paid or not), and 1-year persistency (whether lapse happened at month 13 or not)
  3. Compare model results by key customer segments and profiles
  4. Export to a visualization tool (e.g. Tableau) to present results
  5. Model use test: scoring overlay and optimization strategy
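
Steps 1 and 2 above amount to a left join on a de-personalized key followed by deriving three binary targets. A minimal Python sketch, with an invented record layout and field names purely for illustration:

```python
# Hypothetical de-personalized records keyed by policy id (layout invented).
sales = {"P1": {"agreed": True}, "P2": {"agreed": True}, "P3": {"agreed": False}}
payments = {"P1": {"first_premium_paid": True}, "P2": {"first_premium_paid": False}}
tracking = {"P1": {"lapsed_month_13": False}}

records = []
for pid, s in sales.items():
    pay = payments.get(pid, {})   # left join: keep every sale record
    trk = tracking.get(pid, {})
    records.append({
        "policy_id": pid,
        "sale": int(s["agreed"]),                                  # target 1: purchase agreed?
        "conversion": int(pay.get("first_premium_paid", False)),   # target 2: first premium paid?
        "persistency": int(not trk.get("lapsed_month_13", True)),  # target 3: still active at month 13?
    })
```

Each of the three models is then fit against its own target column on the same matched population, which is what makes the cross-model comparison in step 3 meaningful.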

 

Describe the benefits you have achieved

  • We could create customer segments based not just on tendency to 'buy', but also on tendency to 'pay' and 'stay'.
  • We could further demonstrate ReMark's analytics and modeling capabilities covering the whole customer lifetime value chain.

Amway logo.jpgAuthor: Adam Rant (@Rant) - Business Systems Analyst

Team Members: Tom Madden, Jordan Howell, Brian Conrad, Megan Lilley & Sankar Mishra

Company: Amway

Business Partner: Marquee Crew (@MarqueeCrew)

 

Awards Category: From Zero to Hero

 

Global Procurement at Amway began its journey with Alteryx by purchasing 2 licenses in October of 2015. We generated instant value from this tool and knew there was so much untapped potential. A few short months after our initial purchase, we started hosting internal Alteryx enablement sessions to spread the word throughout our organization. It wasn’t long before we were up to 10 licenses. As our scope continued to expand, IT got involved and pursued a mini-trial that included Server for the remainder of 2016. In 2017 we purchased the full-year pilot to drive even greater user adoption. Today, we have over 30 users and growing!


As our user base and experience with Alteryx grow, we have evolved from diverse data blending to language translation and normalization. We are moving from legacy tools like Access and Excel into the modern world of analytics with Tableau and Alteryx. We are structuring data that we once thought impossible, and have even built applications to search eBay and now Amazon listings. We are pushing into the world of predictive analytics: models are being developed and geospatial tool-sets are being examined. Most of these were pipe dreams before we were introduced to Alteryx. Now we are making them come to life and blazing a trail for analytics at Amway.

 

Describe the problem you needed to solve

We found Alteryx through Tableau. It started with simple data blending and Tableau data automation, but grew from there.

  • Automating manual scorecards/metrics
  • Translation macro using Google Translate
  • eBay web scrape
  • Amazon web scrape
  • Commodity predictive modeling
  • Spatial analytics
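
The translation macro described above pairs an expensive external call (Google Translate) with data that contains many repeated strings. One common pattern, sketched here in Python with a stub standing in for the real API call (the function names and sample data are illustrative, not the macro's actual internals), is to translate each unique value once and map the results back:

```python
def translate_column(values, translate_fn):
    """Translate a column of strings, calling the (paid) API once per unique value."""
    cache = {}
    for v in set(values):
        cache[v] = translate_fn(v)   # in the real macro, a Google Translate request
    return [cache[v] for v in values]

# Stub translation standing in for the real service.
fake = {"hola": "hello", "mundo": "world"}.get
print(translate_column(["hola", "mundo", "hola"], fake))  # ['hello', 'world', 'hello']
```

Deduplicating before the call keeps both runtime and API cost proportional to the number of distinct strings rather than the number of rows.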

 

Describe the working solution

We are using a wide range of data sources including Excel, Access, SQL Server (In-DB tools), Oracle (In-DB tools), SharePoint, Google Sheets, eBay, and Amazon. Most of our data sets are published directly to Tableau Server. We have Server up and running to automate most of our Tableau dashboards. Deploying apps through the Alteryx Gallery is our next project to tackle.

 

Describe the benefits you have achieved

Alteryx is the engine that is driving our team to new levels. We are automating all of our scorecards from a data perspective. We are able to provide daily insights on the health of our supply chain versus monthly reporting. Here are a few of the major projects we accomplished inside of Alteryx.

 

  1. Automated over 20 data processes, eliminating over 350 hours of data prep and saving over $80,000 annually.
    • These savings are based on time alone; factor in the ability to run these workflows daily and deliver insights to our users, and the figure is very conservative.
    • These workflows are now reusable processing engines that we can continue to enhance and build on.
  2. Using Alteryx we are able to automate the translation of data. We operate in over 100 countries, and we developed a workflow that goes through our data and translates it automatically using Google Translate. We plan on deploying this on our Server as an app for others to leverage.
  3. Jordan Howell eliminated a custom Access database that cost us $3 million to build. In 3 weeks he was able to recreate the database in Alteryx, eliminating the 40 hours a month of manual data preparation the database required and saving us $24,000 a year.
  4. eBay & Amazon web scraping to effectively audit Amway products being sold on these sites. Before Alteryx we did this manually and would only capture 1-10% of the total products on the sites (we had trouble answering macro-level questions about how many products were listed). With Alteryx we can do this in seconds and capture 100% of the products, allowing users to focus on delivering insights from the data rather than pulling it!
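
The parsing half of a product-listing scrape like the one above can be sketched with only the Python standard library. This is a minimal illustration, not Amway's actual workflow, and the `s-item__title` class marker is an assumption about the page markup:

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text of elements whose class contains a product-title marker."""
    def __init__(self, marker="s-item__title"):
        super().__init__()
        self.marker, self.depth, self.titles = marker, 0, []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if self.marker in cls:
            self.depth = 1            # entering a title element
            self.titles.append("")
        elif self.depth:
            self.depth += 1           # nested tag inside a title element

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.titles[-1] += data   # accumulate text inside the title element

page = '<div class="s-item__title">Water Filter</div><div class="other">x</div>'
p = TitleExtractor()
p.feed(page)
print(p.titles)  # ['Water Filter']
```

In practice the fetched pages would be paginated and rate-limited; the point of the sketch is that once the HTML is in hand, extracting 100% of the listed products is a mechanical parse rather than a manual copy-paste exercise.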

 

I1.png

 

 I2.png

 

 I3.png

 

 I4.png

 

I5.png

 

 I6.png

 

 

amway2.png

Honeywell_logo.pngAuthor: Joseph Majewski - Commercial Finance Director

Team Members: Niki Yang, Larry Scates, Dan Trimble, Richard Haas, Wendy Edsall

Company: Honeywell Aerospace

Business Partner: Sherri Benzelock - VP Business Analytics

 

Awards Category: Best Use of Alteryx Server for Analytics Deployment

 

Alteryx has automated previously manual processes, driving over 2,000 hours of annual productivity for the Finance community. Alteryx Server consolidates quarterly outlook commentary data from over 40 sources every 30 minutes and publishes it to Tableau Server as a *.tde data source. Alteryx also loads extracted Essbase files to Tableau Server nightly, which are joined together for enhanced Tableau analytics providing insight into our quarterly forecast.

 

Awards Category: Best Value Driven with Alteryx

 

Alteryx helped drive over 2,000 annual hours of productivity and enabled over 250 users to migrate to Tableau-based self-service analytics. Consumers are now better prepared for weekly revenue outlook meetings, as they have access to the data almost in real time rather than waiting for Finance to prepare reports for the weekly forecast meetings. The new analytics developed in Tableau are now being referenced live in weekly meetings with business presidents.

 

Describe the problem you needed to solve

Significant manual effort was required to compile business comments and associate them with the financial data on a management review dashboard in PowerPoint. Meetings were ineffective because only limited static views could be generated, and the data presented was information overload and difficult to digest. There were no dynamic selection views, so questions typically needed to be captured, then answered and communicated after the meeting. Historical versions were saved in separate data sources and were not readily available for the reviews.

 

Alteryx flow to update the Comments, which publishes the comments data source to both our Aero-Development and our Aero Certified-Production sites

Describe the working solution

Under the new Alteryx solution, the compilation of weekly commentary files has been automated. Every week over 40 comment files are saved to a shared drive in *.csv format. Since submissions occur from all global regions, files are submitted at different times each week and are now consolidated more frequently (every 30 mins) allowing users across the globe to access the most updated information. The process also manages current week and historical weeks’ comments. We are also using Alteryx to cleanse the data removing all the error values and to correct business team data mapping issues. Alteryx also loads extracted Essbase files to Tableau server nightly which are joined together for enhanced Tableau analytics providing insight to our quarterly forecast.
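
The consolidate-and-cleanse step described above can be sketched in Python. This is an illustrative stand-in for the Alteryx workflow, not its actual implementation; the file pattern, column names, and error values are assumptions:

```python
import csv
import glob

def consolidate_comments(pattern, error_values=("#N/A", "#REF!", "ERROR")):
    """Stack every weekly comment file matching `pattern`, blanking out error values."""
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Cleanse spreadsheet error values the way the workflow does.
                cleaned = {k: ("" if v in error_values else v) for k, v in row.items()}
                cleaned["source_file"] = path   # keep lineage for troubleshooting
                rows.append(cleaned)
    return rows
```

Run on a 30-minute schedule, a step like this means a region's comments appear in the consolidated data source shortly after its file lands on the shared drive, regardless of time zone.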

 

Alteryx flow that brings in the new Essbase data (“SRO_Extract_Current.txt”) and updates/appends/replaces the scenarios in the “SRO_Extract_History.txt” data source file used by Tableau

Describe the benefits you have achieved

Utilizing Alteryx has resulted in initial savings of 2,000+ hours per year by cleansing, compiling, and blending business commentary with financial data. Analysts are no longer manually compiling, copying, and pasting comments onto quarterly outlook presentations. Weekly review meetings became more effective, as over 250 business leaders and analysts are better informed beforehand rather than seeing the data for the first time in the meeting. This makes business meetings more efficient, helping Honeywell achieve its revenue growth targets.

The Alteryx dataset allows Honeywell Aerospace to slice and dice all information dynamically using Tableau, which was not possible before. Questions are easily answered in the meetings because the data, with variance, opportunity, and risk commentary, is available prior to the meeting. Ad hoc questions are now researched live in meetings through our enhanced visualizations. Tableau’s dynamic reporting eliminated the need for Excel and PowerPoint reports, reducing meeting preparation times as well. Business teams are now able to focus more on analytics than on data and presentation preparation. The solutions we developed are making our business processes more contemporary, giving business users quicker access in a format that is easier to consume on desktop and mobile devices and ensuring they have the necessary information to make the right business decisions.

 

honeywell3-A.png

 

Visual of Tableau Analytic with associated comments

Author: Katie Snyder, Marketing Analyst

Company: SIGMA Marketing Insights

 

Awards Category: Most Time Saved

 

We've taken a wholly manual process that took two hours per campaign and required a database developer to one that takes five minutes per campaign and can be done by an account coordinator. This frees our database developers to work on other projects and drastically reduces the time from data receipt to report generation.

 

Describe the problem you needed to solve 

We process activity files for hundreds of email campaigns for one client alone. The files come in from a number of different external vendors, are never in the same format with the same field names, and never include consistent activity types (bounces or opt-outs might be missing from one campaign, but present in another). We needed an easy, user-friendly way for these files to be loaded in a consistent manner. We also needed to add some campaign ID fields that the end user wouldn't necessarily know - they would only know the campaign name.

 

Describe the working solution

Using interface tools, we created an analytic app that allows maximum flexibility in this file processing. Using a database query and interface tools, Alteryx displays a list of campaign names for the end user to select; the accompanying campaign ID fields are passed downstream. For each activity type (sent, delivered, bounce, etc.), the end user selects a file, and then a drop-down displays the names of all fields in the file, allowing the user to designate which field is email, which is ID, and so on. Because we don't receive each type of activity every time, detours are placed to allow the analytic app user to check a box indicating a file is not present, and the workflow runs without requiring that data source.
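
The field-designation step above boils down to renaming whatever columns each vendor sends into a canonical layout. A minimal Python sketch of the idea; the field names and mapping are invented for illustration, not taken from the app:

```python
def normalize_activity(rows, field_map):
    """Rename user-designated columns to canonical names (email, customer_id, ...).

    `field_map` plays the role of the app's drop-down selections,
    e.g. {"EmailAddr": "email"}; unmapped columns pass through unchanged.
    """
    return [{field_map.get(k, k): v for k, v in row.items()} for row in rows]

bounces = [{"EmailAddr": "a@x.com", "AcctNo": "17"}]
print(normalize_activity(bounces, {"EmailAddr": "email", "AcctNo": "customer_id"}))
# [{'email': 'a@x.com', 'customer_id': '17'}]
```

Once every vendor file is normalized to the same schema, the downstream join, dedupe, and load steps can stay identical across campaigns.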

 

 

 

All in all, up to six separate Excel or CSV files are combined together with information already existing in a database, and a production table is created to store the information. The app also generates a QC report that includes counts, campaign information, and row samples that is sent to the account manager. This increases accountability and oversight, and ensures all members of the team are kept informed of campaign processing.

 

Process Opt Out File - With Detour:

 

 

Join All Files, Suppress Duplicates, Insert to Tables:

 

 

Generate QC Report:

 

 

Workflow Overview:

 

 

QC Report Example:

 

 

Describe the benefits you have achieved

In making this process quicker and easier to access, we save almost two hours of database developer time per campaign, which accounts for at least 100 hours over the course of the year. The app can be used by account support staff who don't have coding knowledge, or even by staff on different accounts without any client-specific knowledge, also saving resources. Furthermore, the app can be easily adapted for other clients, increasing time savings across our organization. Our developers are able to spend time on far more complex work rather than routine coding, and because the process is automated, it avoids any rework time that would result from coding mistakes. And the client is thrilled because it takes us less time to generate campaign reporting.

Author: Sintyadi Thong ( @MizunashiSinayu ), Pre-Sales Consultant, Karen Kamira & Harry Yusuf

Company: Lightstream Analytics PTE Ltd.

 

LightStream Analytics focuses on performance management, big data, advanced analytics, and innovative visualization through cloud SaaS applications, mobile, and traditional on-site systems. LightStream Analytics is well-positioned to deliver the most advanced products and services by capitalizing on its significant regional presence in Singapore and Indonesia. The combined offices have over 60 employees with deep technical and senior business experience. The company leverages our existing technical support and R&D centers in Indonesia and China to develop solutions which disrupt customary methods of data analysis and give clients access to revolutionary tools for understanding their data.  LightStream Analytics has partnered with more than 100 multinational and local clients to integrate, structure, analyze, and visualize information to measure their business performance and drive enterprise value growth.

 

Awards Category: Most Time Saved

  

Describe the problem you needed to solve 

 

One of our clients tried to implement one of the leading Business Intelligence solutions to help grow their business through another company (we can pretty much say our competitor). However, one thing hindered their development of the BI. While most companies want to see the dates on which their agents perform sales, this company wanted to see the opposite: on which dates their agents do not perform any sales activity. For them this is very important. The BI developers hit a dead end on this, and so I came in with Alteryx.

 

Describe the working solution

The BI I previously mentioned is QlikView. Qlik can do it, I can guarantee, but it involves heavy scripting and logic, which in turn requires heavy resources to run (visible when running with low RAM). Alteryx, on the other hand, can do this easily with a drag-and-drop, repeatable workflow: I fed Alteryx the actual sales data and performed several left joins, filters, and uniques. Alteryx requires no scripting; to be honest I am not even an IT guy, I know nothing about SQL and programming, yet I could create this workflow easily. So we proposed having Alteryx prepare and blend the data before feeding it to QlikView, which makes the result visible earlier and lessens the burden on QlikView. While the client has not yet confirmed whether they will get Alteryx, it is really satisfying and rewarding to easily solve a problem that others struggled with.
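
The core of the "dates without sales" logic is a set difference between a calendar range and the dates that do appear in the sales data. A minimal Python sketch with invented sample dates (not the client's data):

```python
from datetime import date, timedelta

def dates_without_sales(sales_dates, start, end):
    """All dates in [start, end] on which an agent recorded no sale."""
    all_days = {start + timedelta(days=i) for i in range((end - start).days + 1)}
    return sorted(all_days - set(sales_dates))

sold = [date(2017, 5, 1), date(2017, 5, 3)]
print(dates_without_sales(sold, date(2017, 5, 1), date(2017, 5, 4)))
# [datetime.date(2017, 5, 2), datetime.date(2017, 5, 4)]
```

In the Alteryx workflow the same effect comes from generating the calendar rows, left-joining the sales data onto them, and keeping the unmatched side: the rows with no match are exactly the inactive days.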

 

Describe the benefits you have achieved

I created the workflow in only an hour versus their two weeks of development for this one case (which still failed after those two weeks). This shows how much time the client would save by developing QlikView alongside Alteryx. Alteryx helps customers get results faster and perform advanced ETL that might be hard to do in traditional SQL.

Author: Jeffrey Jones (@JeffreyJones), Chief Analytics Officer  

Company: Bristlecone Holdings

 

Awards Category:  Name Your Own - Most Entertaining (but Super-Practical) Use of Alteryx

 

Describe the problem you needed to solve 

Our marketing department needed a working Sex Machine, but that sort of thing was strictly prohibited in our technology stack.

 

Describe the working solution

Analytics built a functional Sex Machine! Let me explain...

 

Because our business involves consumer lending, we absolutely cannot -- no way, no how -- make any kind of decisioning based on sex or gender. Regulators don't want you discriminating based on that, so we don't even ask about it in our online application, nor do we store anything related to sex in our database. Sex is taboo when it comes to the Equal Credit Opportunity Act. But the problem was that the marketing department needed better insight into our customer demographics so they could adjust their campaigns and the messaging on our website, videos, etc., based on actual data instead of gut instinct.

 

Well, it turns out the Census Bureau publishes awesome (and clean) data on baby names and their sex. So we made a quick little workflow to import and join 134 years of births in the U.S. resulting in over 1.8 million different name/sex/year combinations. We counted the occurrences, looked at the ratio of M to F births for each and made some (fairly good) generalizations about whether a name was more likely a "Male" name or "Female" name. Some were pretty obvious, like "John." Others were less obvious, like "Jo." And some were totally indeterminate, like "Jahni."
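
The ratio-based generalization described above can be sketched in a few lines of Python. The cutoff, counts, and structure here are illustrative assumptions, not the team's actual workflow:

```python
def classify_names(birth_counts, cutoff=0.9):
    """Label a name M/F when one sex accounts for at least `cutoff` of its births.

    `birth_counts` maps name -> {"M": count, "F": count}, summed over all years.
    """
    labels = {}
    for name, c in birth_counts.items():
        total = c.get("M", 0) + c.get("F", 0)
        m_share = c.get("M", 0) / total
        if m_share >= cutoff:
            labels[name] = "M"
        elif m_share <= 1 - cutoff:
            labels[name] = "F"
        else:
            labels[name] = "indeterminate"   # too close to call, like "Jahni"
    return labels

# Invented counts echoing the examples in the text.
counts = {"John": {"M": 5_000_000, "F": 20_000},
          "Jo": {"M": 20_000, "F": 230_000},
          "Jahni": {"M": 510, "F": 490}}
print(classify_names(counts))  # {'John': 'M', 'Jo': 'F', 'Jahni': 'indeterminate'}
```

Varying the cutoff trades coverage for reliability, which is presumably how one ends up with the "90% reliable / 7% less reliable / 3% unknown" split described below.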

 

Then we joined this brand new data set to an export of our 200k customer applications and were able to determine the sex of around 90% of our applicants fairly reliably, another 7% with less reliability, and only 3% as completely unknown. The best thing about it is that we were able to answer these questions completely outside our lending technology stack, in a manner disconnected from our decisioning engine, so as to maintain legal compliance. We also didn't have to waste any money or time on conducting additional customer surveys.

 

This was literally something that was conceived in the middle of the night and had been born into production before lunch on the following day. (bow-chicka-bow-bow) Doing this wouldn't have been just impossible before Alteryx, it would have been LAUGHABLY IMPOSSIBLE. Especially given the size of the third-party data we needed to leverage and the nature of our tech stack and the way regulation works in consumer lending.

 

Describe the benefits you have achieved

It sounds silly, but our organization did realize tangible benefit from doing this. Before, we had no idea about a critical demographic component for our customers. It's impossible to look at a bank of nearly 200k names across four totally unrelated industry verticals and conclude any sex-related trends with confidence. Now we can understand sex-related trends in the furniture, bridal, pet, and auto industries. We can link them to the products customers are actually getting and tweak the messaging on our website accordingly. And what's more, we're able to do all this in real time going forward without wasting any of our DBAs' time or distracting our legal department. This probably saved us a hundred man-hours or more, given all the parties that would have needed to get involved to answer this simple demographic question.

 

We should probably tidy up this workflow and the .yxdb because it might be useful for other companies who want to get a full demographic breakdown but don't have any pre-existing information on customer sex. If anybody wants to know the total number of people born with every name for the last 134 years and needs the M:F occurrence ratio for each, holler at me.

Author: Alex Huang, Asst. Mgr, Quality Planning & Analysis

Company: Hyundai Motor America

 

Awards Category: Most Time Saved

 

There have been just a few times where some tool or platform has truly "changed" my life.  The two that come immediately to mind are Alteryx & Tableau.  Before I had either, the majority of my time was spent wrangling data, creating reports, and doing what I could using SAS, SQL, & Excel.  I had streamlined as much as I could and still felt bogged down by the rudimentary data tasks that plague many of us. 

 

With the power of Alteryx alone, I've regained 1,253 hours per year.  Alteryx WITH Tableau has saved me an additional 511 hours, for a total of 1,764 hours saved per year!  Does that mean I can quit?  Maybe…but I’m not done yet!

 

For those that care for the details, here's a table of time savings I had cataloged during the start of my Alteryx journey.  I’ve had to blank out the activity names for security reasons but the time savings are real.

 

 

I experienced a 71% savings in time with Alteryx alone!

 

With this new found "free time," I was able to prototype ideas stuck on my To-Do list and create new insight for my business unit.  Now my "what if's" go from idea straight to Alteryx (and to Tableau faster) and I couldn't be happier.  Insights are delivered faster than ever and with more frequent (daily) updates thanks to Alteryx Desktop Automation.

 

Describe the problem you needed to solve

Hyundai Motor America sells thousands of cars per day, so the faster we can identify a quality issue and fix it, the more satisfied our customers will be.  Addressing quality concerns earlier and faster helps us avoid additional costs and, most importantly, protects brand loyalty, perceived quality, and vehicle dependability.  Some examples of actions:

 

  1. Increased the speed at which we validate and investigate problems from survey data resulting in faster campaign launches and remedy development.
  2. Able to digest and understand syndicated data from J.D. Powers within hours instead of weeks allowing us to further validate the effectiveness of our prior quality improvement initiatives and also identify issues we missed.
  3. Being able to blend all the data sources we need (call center, survey data, repair data, etc.) in Alteryx allowed us to more rapidly prototype our customer risk models vs. traditional methods via SAS which took much longer.
  4. Alteryx automation with Tableau allowed us to deploy insight-rich interactive dashboards that enabled management to respond to questions in real time during our monthly quality report involving many major stakeholders throughout Hyundai.  This led to more productive meetings with more meaningful follow-up action items.

 

I needed to solve a time problem first!  I was spending too much time on things like data prep and reporting, which left me without enough time to do what I really wanted to do: solve problems!

 

Being an avid fan/user of Tableau, data preparation started becoming my biggest challenge as my dashboard library grew.  I would end up writing monster SQL statements and scripts to get the data ready but I still struggled with automation for creating Tableau Data Extracts (TDE's). I explored using Python to create them but it just wasn't quite the "desired" experience.  Enter Alteryx, life changed.

 

 

Describe the working solution

My work typically involves blending data from our transactional data warehouse, call center data, survey data, and third-party data from companies like J.D. Powers.  Since we have an Oracle database in-house, I'm able to leverage the In-DB tools in Alteryx, which are just amazing!  In-DB tools are similar to a "visual query builder" but with the Alteryx look, feel, and added capability of Dynamic Input and Macro Inputs.  Since data only moves out of the DB when you want it to, queries are lightning fast, which enables accelerated prototyping!

 

Describe the benefits you have achieved

I've quite literally freed up 93% of my time (given 1960 work hours per year with 15 days of vacation @ 8 hours per day) and started a new "data team" within my business unit with Alteryx & Tableau at its core.  The ultimate goal will be to replicate my time savings for everyone and “free the data” through self-service apps.  At this point, I’ve deployed 5,774 Alteryx nodes using 61 unique tools in 76 workflows of which 24% or so are scheduled and running automatically.  Phew!  Props to the built-in “Batch Macro Module Example” for allowing me to calculate this easily!

 

 

We are able to identify customer pain points through an automated Alteryx workflow and algorithm that gauges how likely an issue will persist across all owners of the same model/trim package.  We’ve seen how blending Experian ConsumerView data bolsters this model but we’re still in the cost justification phase for that.  Upon detection of said pain point, we are able to trigger alerts and treatments across the wider population to mitigate the impact of this pain point.  Issues that can’t be readily fixed per se are relayed back to R&D for further investigation.  Ultimately customers may never see an issue because we’ve addressed it or they are simply delighted by how fast we’ve responded even when no immediate remedy is available.

 

The true bottom line is that the speed and accuracy at which we execute is critical in our business.  Customers want to be heard, and they want to know how we are going to help resolve their problems now, not months later.  They want to love their Hyundais, and the more they feel like we are helping them achieve that, the more loyal they will be to our brand.

 

Although we can’t fix everything, Alteryx helps us get to where we need to be faster, which, in my opinion, is an enabler for success.

Author: Brett Herman ( @brett_hermann ) , Project Manager, Data Visualization 

Company: Cineplex

 

Cineplex Inc. (“Cineplex”) is one of Canada’s leading entertainment companies and operates one of the most modern and fully digitized motion picture theatre circuits in the world. A top-tier Canadian brand, Cineplex operates numerous businesses including theatrical exhibition, food service, amusement gaming, alternative programming (Cineplex Events), Cineplex Media, Cineplex Digital Media, and the online sale of home entertainment content through CineplexStore.com and on apps embedded in various electronic devices. Cineplex is also a joint venture partner in SCENE – Canada’s largest entertainment loyalty program. 

 

Awards Category: Most Time Saved

 

Describe the problem you needed to solve 

Incremental/uplift modelling is a popular method of evaluating the success of business initiatives at Cineplex. Its effectiveness at measuring the change in consumer behavior over time creates high demand for this kind of analysis from various departments in the organization. Due to the large number of requests we receive, the ‘Incremental Lift Model’ was developed to take in user-defined inputs and output results within a short period of time.

 

Describe the working solution

Our solution works through a four step process. The first step is for the client to populate the ‘study input form’ in order to define their study parameters and the type of study they want to run.

 

Visual 1: Study Input Form

 

The second step is to update/materialize the loyalty data that is input into the model (yxdb format). We do this so the model doesn’t put stress on our SQL Server databases, and to increase the model’s performance.

 

Visual 2: Update/Materialize Alteryx Input Data

 

The third step is the core of the incremental lift modelling. A macro processes one study at a time by pointing to the user defined inputs made in the first step.

 

Visual 3: Study Numbers are selected and passed through the incremental lift macro, and saves the output to SQL.

 

The data will then be passed through one of several macros depending on the study type, and filtered down based on the inputs defined by the user in the study input form. All data sources are joined together and lift calculations are made, which are then outputted into a master SQL Table ready to be visualized.
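
The lift calculation at the heart of such a model is typically the test group's change net of the control group's organic change. A simplified Python sketch; the formula shown is a generic pre/post uplift comparison with invented numbers, not Cineplex's actual macro:

```python
def incremental_lift(test_before, test_after, control_before, control_after):
    """Uplift of the test group net of the control group's organic change."""
    test_change = (test_after - test_before) / test_before
    control_change = (control_after - control_before) / control_before
    return test_change - control_change

# Test group grew 20% while control grew 5% -> roughly 15 points of incremental lift.
print(incremental_lift(100.0, 120.0, 200.0, 210.0))
```

Netting out the control group is what distinguishes true incremental impact from seasonality or other background trends that would have moved both groups anyway.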

 

Visual 4: Incremental Lift Modelling based on study type selected.

 

The results are visualized using a Tableau Dashboard in order to share and communicate the results of the study back to the business clients.

 

Visual 5: Tableau Dashboard to explain to the business how the incremental lift model makes its calculations.

 

 

 

Describe the benefits you have achieved

The overarching goal of this project was twofold: to minimize the work required to process business requests while maximizing the output generated, and to develop a means of delivering results consistently. Both goals contribute greatly to our ROI by virtually eliminating time spent executing incoming requests and by minimizing time spent meeting with business users to explain how the incremental lift model works and how to interpret its results.

 

K-LOVE_logo.pngAuthor: Bill  Lyons  - Principal Data Scientist

Team Members: Trudy Fuher, Alana Welz, Arlyn Baggot

Company: Educational Media Foundation

 

Awards Category: Best ‘Alteryx For Good’ Story 

The initial project has the potential to save this non-profit organization up to $2.2 million per year in streaming costs when recommendations are fully implemented. Other use cases improve internal efficiencies, communication, and productivity.

 

Awards Category: Best Use of Alteryx Server for Analytics Deployment

Alteryx Server automatically processes daily file downloads, weekly file downloads with decompression, decryption and bulk insertion, and monthly zip code DMA assignments. Other use cases support self-service imports, exports and reporting.

 

Awards Category: Best Use of Alteryx for Spatial Analytics

Alteryx spatial tools combined with Alteryx data are driving optimization of regional streams associated with DMAs.

 

Awards Category: Best Value Driven with Alteryx

Optimizing regional streams has resulted in at least $500,000 in savings since July 2016, with recommendations implemented so far. When all recommendations are fully implemented, savings could be $2.2 million per year or more.

 

Awards Category: From Zero to Hero

Even though we purchased our first Designer license in June 2015, as of early March 2016, we had not created a single workflow with Alteryx. We were considering not renewing our license. At that time, we got a new rep, Nick Glassner, who arranged for a couple of WebEx sessions with Alteryx Solutions Engineer Ali Sayeed to get us started on a real project. Within a few weeks, I recognized many more potential applications for Alteryx, and was off and running. I changed from a skeptic to an enthusiastic user. Analysis for this project began in mid-April and was completed in mid-May. We acquired Alteryx Server in June, and had the first phase of the implementation of this project running on a daily schedule by August. Other phases came online in November and in January 2017.

 

At that point, I was still the only person using Alteryx heavily in analysis and production. So, I began some internal workshops showing how to solve real-world problems with Alteryx. We now have 3 more internal users becoming productive with Alteryx, and are looking to hire another. Some of these users are also taking advantage of the “Enablement Series” offered by our new rep, Tim Cunningham.

 

Describe the problem you needed to solve

Initial business problem: Recent regulatory changes caused our national internet radio streaming costs to more than double, from less than $1 million to over $2 million annually. The goal was to find ways to optimize our streams to move usage from the national stream to our underutilized regional streams, and thus reduce our costs.

 

Other use cases, including their business challenges, solutions, and benefits, follow the solutions and benefits of this initial business problem.

 

Describe the working solution

Alteryx played a major role in the analysis of the streaming data. Some of the regional streams were underutilized, while others exceeded their cost-effective limits, so the first phase was to analyze the accuracy of IP address geolocation software to see what could be causing this. The website systems and the log analytic systems used different IP geolocation software (the websites used IP2Location, and the analytic systems used Maxmind), so we needed to know if one was better than the other, or if neither was adequate. However, these systems are isolated from each other by firewalls, making direct comparisons impossible. Alteryx Designer allowed me to connect to three different SQL Server database systems and compare their data with a .csv file from another vendor being evaluated (NetAcuity).
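The heart of this comparison is simple once the data is side by side: for each IP address, do two providers agree on the location? A minimal Python sketch of that agreement check follows; the provider names are real, but the sample IPs and coordinates are invented for illustration, and the real analysis joined database tables rather than in-memory dicts.

```python
# Hypothetical sketch: measure how often two geolocation providers agree.
# Sample data below is invented; real data came from SQL Server and a CSV.

def agreement_rate(provider_a, provider_b, decimals=1):
    """Share of shared IPs both providers place at (roughly) the same coordinates."""
    shared = provider_a.keys() & provider_b.keys()
    if not shared:
        return 0.0
    same = sum(
        1 for ip in shared
        if round(provider_a[ip][0], decimals) == round(provider_b[ip][0], decimals)
        and round(provider_a[ip][1], decimals) == round(provider_b[ip][1], decimals)
    )
    return same / len(shared)

# Toy example: the two providers agree on one of two IPs.
ip2location = {"1.2.3.4": (40.71, -74.01), "5.6.7.8": (34.05, -118.24)}
maxmind     = {"1.2.3.4": (40.73, -74.00), "5.6.7.8": (41.88, -87.63)}
print(agreement_rate(ip2location, maxmind))  # 0.5 with this toy data
```

Rounding to one decimal (roughly 11 km) is an arbitrary tolerance here; the actual analysis used spatial matching against DMA polygons instead.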

 

This analysis made extensive use of Alteryx spatial matching and Alteryx spatial data, visualizing results with Tableau. It revealed some disturbing facts, including that the geolocation was very inconsistent between the systems. As an example, we found that less than half of the listeners to the New York City stream were even in the NYC DMA (Figure 1).

 

Figure 1

 

Additionally, we learned that only a little more than half of the listeners in the NYC DMA were listening to the NYC stream. (Figure 2)

Figure 2

 

 The analysis also compared actual registered listener locations to the location reported by the various services. This showed that IP2Location was clearly inferior. (Figure 3)

 

Figure 3

But Maxmind returned a significantly higher number of unknown locations, both within the US and even at the country level. (Figure 4)

 

Figure 4
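Comparing registered listener locations to provider-reported ones comes down to two metrics per provider: the typical distance error over resolved IPs, and the count of unknowns. A small Python sketch of that scoring, assuming (lat, lon) pairs and using the standard haversine formula; all names and sample data are illustrative:

```python
import math
from statistics import median

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def provider_error(actual, reported):
    """Median km error over resolved IPs, plus a count of unknown locations.

    actual, reported: {ip: (lat, lon)}; missing keys in `reported` count as unknown.
    """
    errors, unknown = [], 0
    for ip, true_loc in actual.items():
        loc = reported.get(ip)
        if loc is None:
            unknown += 1
        else:
            errors.append(haversine_km(true_loc, loc))
    return (median(errors) if errors else None), unknown
```

A provider can then be "clearly inferior" on either axis: large median error (like IP2Location in Figure 3) or many unknowns (like Maxmind in Figure 4).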

 

The analysis concluded with 16 recommended changes to systems, software, programming and contracts.


One of those recommendations was to unify both the websites and the analytics on the same and most consistently accurate IP address geolocation provider: NetAcuity. Alteryx supports the updates to the NetAcuity database by downloading the data from NetAcuity, decompressing, decrypting, and bulk inserting it into SQL Server. It does this on a weekly schedule in Alteryx Server, each time moving roughly 40 million rows of data in about an hour.
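The weekly load follows a familiar pattern: decompress, then bulk insert in batches rather than row by row. A stdlib-only sketch of that pattern, with gzip standing in for the real decompression/decryption step and SQLite standing in for SQL Server; the table and column names are invented:

```python
# Sketch of the weekly load described above (assumed shape of the data).
# SQLite stands in for SQL Server; the real job also handles decryption.
import csv
import gzip
import sqlite3

def bulk_load(gz_path, conn, batch_size=50_000):
    """Decompress a gzipped CSV and bulk insert it in batches. Returns row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS ip_geo (ip_from TEXT, ip_to TEXT, dma TEXT)")
    total = 0
    with gzip.open(gz_path, "rt", newline="") as f:
        batch = []
        for row in csv.reader(f):
            batch.append(row[:3])
            if len(batch) >= batch_size:
                conn.executemany("INSERT INTO ip_geo VALUES (?, ?, ?)", batch)
                total += len(batch)
                batch.clear()
        if batch:  # flush the final partial batch
            conn.executemany("INSERT INTO ip_geo VALUES (?, ?, ?)", batch)
            total += len(batch)
    conn.commit()
    return total
```

Batching the inserts (here via `executemany`) is what makes moving tens of millions of rows in about an hour plausible; per-row inserts would be far slower.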

 

Primary workflow: KLOVE-5.png

 

Supporting macros: KLOVE-6.png

 

 

An Alteryx Server scheduled app then builds Calgary databases of the IP geolocation data. KLOVE-7.png

 

 

Next, another Alteryx Server scheduled app applies that geocoding to the streaming log data. KLOVE-8.png

 

 KLOVE-9.png

 

 

Alteryx spatial data also supports Server scheduled monthly updates to keep zip to DMA to stream assignments up to date.

 

Describe the benefits you have achieved

Four of the 16 recommendations have been implemented to date, saving over $500,000 since last July, with an estimated $700,000 in savings for 2017. More steps are in development, with a goal of saving $2 million per year.

 

Never before did we have a reliable and up-to-date zip code to DMA assignment process. We previously bought zip code to DMA data from Nielsen, but it was incomplete and quickly out-of-date.

 

Other Significant Alteryx Use-Cases

 

1. Transmitter location identification

  • Business Challenge: Property tax filings must be made with the appropriate jurisdiction for the location of the property. With most property, the street address easily identifies that jurisdiction. However, radio transmitter sites are frequently in very remote locations with no street address, often on mountaintops within a few feet of jurisdictional boundaries. Historically, property tax accountants manually used transmitter geographic coordinates to search maps and identify the state and county with which to file property tax forms. This very laborious process took a team of 3 or 4 people up to 8 weeks each year, and was fraught with error.
  • Solution: An Alteryx Server scheduled app performs a spatial match between transmitter geographic coordinates and Alteryx spatial data, precisely and accurately identifying and coding each transmitter's state and county. Run time: about 15 seconds per day, automatically. This simple workflow took only a couple of hours to build and deploy. KLOVE-10.png

     

  • Benefit: Savings of up to 8 man-months of manual labor per year. Reduction in errors (this process identified more than 200 instances where the location was either undocumented or in the wrong jurisdiction; 2 were even in the wrong state).
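The spatial match above boils down to a point-in-polygon test: which county boundary contains each transmitter's coordinates? A self-contained sketch using the classic ray-casting algorithm; the county names and square boundaries below are made up, since the real workflow matches against Alteryx spatial data:

```python
# Illustrative stand-in for the Alteryx spatial match: ray-casting
# point-in-polygon against simplified (invented) county boundaries.

def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Toggle whenever a ray from the point crosses this edge.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def assign_county(transmitter, counties):
    """Return the first county whose polygon contains the transmitter, or None."""
    return next((name for name, poly in counties.items()
                 if point_in_polygon(transmitter, poly)), None)
```

The "within a few feet of jurisdictional boundaries" problem is exactly why precise polygons matter: a transmitter near an edge can flip counties with a tiny coordinate error.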

 

2. Log file FTP download

  • Business Challenge: The system downloading new log files from the content delivery network (CDN) daily was very fragile, requiring manual checks and restarts every few days.
  • Solution: An Alteryx workflow app, scheduled to run daily, downloads the list of available files, compares it to the list of previously downloaded files, downloads the new files, and updates the list of downloaded files. K-LOVE-11.png

     

     KLOVE-12.png

     

 

 

  • Benefit: The Alteryx job has run without error for 8 months. It saves about an hour per week in monitoring and maintenance, but mostly it is a huge reduction in the "hassle factor." Development time cost less than a couple of months' worth of manual corrections.
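The robustness of this pattern comes from its idempotence: comparing the remote listing to a log of what has already been fetched means a failed run simply picks up where it left off. A minimal Python sketch of that list-compare logic (the `fetch` callable is a placeholder for the actual FTP download):

```python
# Hedged sketch of the incremental-download logic; the real app uses
# Alteryx's Download tool. `fetch` here is a hypothetical placeholder.

def new_files_to_fetch(remote_listing, downloaded_log):
    """Remote files not yet downloaded, sorted for a stable processing order."""
    return sorted(set(remote_listing) - set(downloaded_log))

def run_daily_cycle(remote_listing, downloaded_log, fetch):
    """Fetch each new file and record it immediately, so a crash mid-run
    never causes an already-fetched file to be downloaded again."""
    for name in new_files_to_fetch(remote_listing, downloaded_log):
        fetch(name)                   # placeholder for the actual download
        downloaded_log.append(name)   # update the "already downloaded" list
    return downloaded_log
```

Because each file is logged as soon as it lands, re-running the job after any failure is always safe, which is what lets it run unattended for months.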

 

3. User import of Excel into SQL Server

  • Business Challenge: Data files from mobile app vendors arrive each month as Excel files and need to be imported into SQL Server. The import had to be performed manually by a DBA, and was consequently a year behind.
  • Solution: A Gallery app allows users to upload files themselves; it automatically removes duplicate data and reports duplicates ignored, structure errors, and data imported. KLOVE-13.png

     

     

 

  • Benefit: Self-service of data import relieves workload of DBAs and allows users to have immediate reporting of data in Tableau. This process also revealed that the supplier had duplicate records that overlapped between months. This had created erroneous data of which we had not previously been aware.
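The duplicate screening that surfaced those overlapping monthly records can be sketched as a key-based check: rows whose key is already in the warehouse (or earlier in the same upload) are reported and skipped, not re-inserted. The key name below is a hypothetical stand-in for whatever identifier the vendor files actually carry:

```python
# Assumed shape of the gallery app's duplicate screening; `record_id`
# is an invented key name for illustration.

def screen_import(new_rows, existing_ids, key="record_id"):
    """Split an upload into rows to import and duplicates to report."""
    imported, duplicates = [], []
    seen = set(existing_ids)
    for row in new_rows:
        rid = row[key]
        (duplicates if rid in seen else imported).append(row)
        seen.add(rid)  # also catches duplicates within the same upload
    return {"imported": imported, "duplicates_ignored": len(duplicates)}
```

Reporting the ignored duplicates, rather than silently dropping them, is what made the supplier's month-to-month overlap visible.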

 

4. Tealium reporting

  • Business Challenge: Connecting Tableau directly to Redshift was slow.
  • Solution: In-Db tools query Redshift database, filter, aggregate, and download to Tableau Server Data Source Extract. App is scheduled in Alteryx Server. 

     

    KLOVE-14.png
  • Benefit: Faster Tableau reports

 

5. Studio automation logs

 

  • Business Challenge: Log files have been inconsistent and incomplete, with gaps and overlaps, making downstream reports unreliable.
  • Solution: The Download tool connects directly to the REST API of the studio automation software, parses the JSON, and inserts it into the SQL Server data warehouse. Scheduled daily in Alteryx Server. KLOVE-15.png

     

     

  • Benefit: Reliable data for reporting.

 

6. Record of donor communication

 

  • Business Challenge: Producers call donors to record their stories, logging that call in Google Sheets. Donors call back, talking to communicators in the Listener Services department who have no visibility to the Google Sheets, and there was no record in the donor system. Awkward conversations ensued.
  • Solution: An Alteryx Server app scheduled to run every 5 minutes connects to the Google Sheet, downloads the call records, and inserts them into the SQL Server donor system of record. KLOVE-16.png

     

  • Benefit: Listener Services communicators can now intelligently communicate with donors.

 

Author: Thomas Ayme, Manager, Business Analytics

Company: Adidas International Trading B.V

 

Awards Category: Name Your Own - Best Planning and Operational Use

 

Describe the problem you needed to solve 

As a new and successful business, adidas Western Europe eCommerce keeps growing faster and faster; new services are being launched every week, an increasing number of marketing campaigns are being run simultaneously, etc. This leads to more and more products having to be shipped out every day to our end consumers.

 

This strong growth leads to an exponential increase in the complexity of forecasting our unit and order volumes, but also to bigger costs in the case of forecasting mistakes or inaccuracies.

 

As these outbound volumes keep on increasing, we were faced with the need to develop a new, more accurate, more detailed, and more flexible operational forecasting tool.

 

Such a forecasting tool would have to cater to the complexities of forecasting for 17 different markets rather than a single pan-European entity. Indeed, warehouse operations and customer service depend on a country-level forecast to plan carriers and linguistic staff. This is a unique situation where, on top of having a rapidly growing business, we have to take into account local marketing events and market specificities.

 

Finally, given the importance of ensuring consumer satisfaction through timely delivery of their orders, we also decided to provide a daily forecast for all 17 markets rather than the usual weekly format. Such a level of detail improves the warehouse's shipping speed but also increases, once again, the difficulty of our task.

 

Describe the working solution

 

Our first challenge was to find reliable sources of information. Both business analytics (financial and historical sales data) and web analytics (traffic information) data were already available to us through SAP HANA and Adobe Analytics. However, none of our databases were capturing in a single place all information related to marketing campaigns, project launches, events, adhoc issues, etc.

 

That is why we started by building a centralized knowledge database, which contains all past and planned events that can impact our sales and outbound volumes.

 

This tool is based on an Alteryx workflow, which cleans and blends together all the different calendars used by the other eCommerce teams. In the past, bringing those files together was a struggle since some of them are based on Excel while others are on Google Sheets; moreover, each uses a different format.

 

 

We made the best of this opportunity of now having a centralized event database by also developing a self-service visualization tool in Tableau, which displays all those past and future events. Such a dashboard is now used to:

 

  1. Give some background to our stakeholders about what is driving the volumes seen in the forecast.
  2. Have an overview of the business when we review the sales targets of the coming weeks, etc.

 

Next, we created a workflow which, thanks to this new centralized event database, defines a set of "genes" for each past and upcoming day, in each market. These genes flag potential ad hoc issues, commercial activations, level of discount, newsletter send-outs, etc.

 

This gene system can then be used to select the historical data used to forecast upcoming periods, by matching future and past days that share the same, or at least similar, genes. This is the first pillar of our forecasting model.
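The matching idea can be sketched in a few lines: treat each day's genes as a set, prefer exact matches, and fall back to the most similar historical days when none exist. The similarity measure below (Jaccard overlap) and the gene names are our illustrative assumptions, not necessarily the production logic:

```python
# Hypothetical sketch of gene-based day matching; gene names and the
# Jaccard fallback are illustrative assumptions.

def jaccard(a, b):
    """Set similarity in [0, 1]: size of intersection over size of union."""
    return len(a & b) / len(a | b) if a | b else 1.0

def matching_days(future_genes, history):
    """history: {date: set_of_genes}. Exact matches first, else best overlap."""
    exact = [d for d, g in history.items() if g == future_genes]
    if exact:
        return exact
    best = max(jaccard(future_genes, g) for g in history.values())
    return [d for d, g in history.items() if jaccard(future_genes, g) == best]
```

The fallback is what makes the model degrade gracefully: a future day with a never-seen gene combination still gets forecast from the closest historical analogues, with manual override (described below) as the last resort.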

 

The second pillar of our forecasting tool is a file containing our European weekly targets. These targets are constantly being reviewed based on new events shown in the centralized event database and current business trends. 

An Alteryx workflow derives from this target file our sales expectation for each upcoming day, market, category (full price, clearance), and article type (inline or customized). In order to do so, we use historical data selected by our genes, in addition to a set of algorithms, and calculate the sales impact ratio of each market and category. These ratios are then used to allocate a target to each combination.
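The allocation step itself is proportional: each market/category combination receives a share of the European weekly target in line with its historical sales ratio. A minimal sketch, with invented ratios; normalizing the shares guarantees the daily allocations sum back to the weekly target:

```python
# Simplified sketch of the target split; market names and ratios are invented.

def allocate_target(weekly_target, ratios):
    """ratios: {(market, category): historical share}. Shares are normalized
    so the allocation always sums back exactly to the weekly target."""
    total = sum(ratios.values())
    return {combo: weekly_target * share / total
            for combo, share in ratios.items()}

# Toy example: Germany full-price historically sells twice as much as the others.
ratios = {("DE", "full"): 2, ("DE", "clearance"): 1, ("FR", "full"): 1}
allocation = allocate_target(100, ratios)
```

The same proportional split applies one level down, from market/category to individual days, using day-level ratios computed from the gene-matched history.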

 

 

Finally, both pillars are brought together and, in a final Alteryx workflow, we derive how many orders and units will have to be placed in each market and for each article type.

 

However, since certain periods of time have gene combinations that cannot be matched, our working solution also gives us the flexibility to manually override the results. These forecast volumes are then shared with the team, warehouse, customer service call centers, etc. through a Tableau dashboard.

 

 

Describe the benefits you have achieved

Thanks to the work that went into developing this new forecasting model in Alteryx, the adidas WE eCommerce business ended up getting:

 

  • A more accurate forecasting model, which allows for a better planning of our operations.
  • Reduced operational costs.
  • A more detailed forecast as we can now forecast on a daily level, when past methods required much more work and limited us to a weekly forecast.
  • A flexible forecasting model that can easily be modified to include new services and sales channels.
  • A forecast dashboard that lets us easily communicate our forecast to an ever growing number of stakeholders.
  • A centralized event “calendar” that can be used by the entire department for much more than simply understanding the forecast (e.g. it is used to brief in Customer Service teams on upcoming events).
  • A massive amount of free time that can be used to drive other analyses, as it is not required from us anymore to manually join together different marketing calendars and other sources of information, create manual overviews of the upcoming weeks, manually split our weekly sales target, etc.

Author: Luojiao Shen (@Beta) – Business Analyst

Team Members: Dan Totten, Celia Ortiz, David Harris, Matt Shanku, Mario Beasley, Rong Jing

Company: Ford Motor Company

 

 

Awards Category: From Zero to Hero

 

Our Ford IT Analytics Architecture Team started to work on an Enterprise Technology Refresh Program in Feb 2016. The objective was to simplify the technology footprint and the tech renewal process across the Enterprise in support of our IT Strategic Initiatives.


We use Alteryx to blend dozens of data sources in order to prepare the data for analysis and visualizations. Our team was trained within the company and able to use Alteryx in a week. We use Alteryx to visualize the current state of technology and application portfolios in order to gain insights. Additionally, we perform data analysis for resource prioritization and planning of Tech Refresh activities.

 

Describe the problem you needed to solve

Initial business problem: In order to visualize the technology footprint, it was necessary to pull data from multiple data sources across our large, complex IT organization. Technology is moving faster than ever before, and Ford IT needs to stay ahead of this rapid pace in order to satisfy its customers with premium information-based services and exceptional mobility products. Additional use case: technologies for servers, applications, appliances, database components, hosting landing zones, etc. need to align with rapid delivery of capabilities to the business.

 

Describe the working solution

Our inputs came in every imaginable format. Dozens of repeatable workflows and macros were created to blend, ingest and process data to drive sound decision making. InDB tools were utilized for Hadoop Big Data transformation and output to visualizations in Tableau and QlikView.

 

Describe the benefits you have achieved

Our data blending processes are now automated using Alteryx macros to generate multiple reports and visualizations. Alteryx enabled self-service and object-based trouble shooting, data preparation and blending.

 

The user friendly interface helps us to rapidly modify or create new workflows.

  • Time saving – processing time is reduced by 80%
  • Reduction of errors – achieved because the process is now automated
  • Business customer satisfaction has increased due to improved efficiency, quality and rapid delivery of actionable metrics

 

ford1.png

 

 

ford2.png

 

 

ford3.png

 

 

ford4.png

 

 

 ford5.png

 

 

 ford6.png

 

 

 ford7.png

Author: Michael Barone, Data Scientist
Company: Paychex Inc

Awards Category: Best Use of Predictive

 

Describe the problem you needed to solve

Each month, we run two-dozen predictive models on our client base (600,000 clients). These models include various up-sell, cross-sell, retention, and credit risk models. For each model, we generally group clients into various buckets that identify how likely they are to buy a product/leave us/default on payment, etc. Getting these results into the hands of the end-users who will then make decisions is an arduous task, as there are many different end-users, and each end-user can have specific criteria they are focused on (clients in a certain zone, clients with a certain number of employees, clients in a certain industry, etc.).
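The bucketing step described above is straightforward to sketch: each client's model score is mapped to a likelihood tier by threshold. The cutoffs and tier names below are invented for illustration, not the actual Paychex model thresholds:

```python
# Illustrative score bucketing; cutoffs and labels are assumptions.

def bucket(score, cutoffs=((0.8, "high"), (0.5, "medium"), (0.0, "low"))):
    """Map a model score in [0, 1] to the first tier whose floor it clears."""
    for floor, label in cutoffs:
        if score >= floor:
            return label
    return "low"

def bucket_clients(scores):
    """scores: {client_id: model_score} -> {client_id: tier}."""
    return {cid: bucket(s) for cid, s in scores.items()}
```

Run once per model, this produces the per-client tiers that end users then filter by zone, employee count, industry, and so on.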


Describe the working solution

I have a prototype app deployed via Alteryx Server that allows the end-user to “self-service” their modeling and client criteria needs. This is not in Production as of yet, but potentially provides great accessibility to the end-user without the need of a “go-between” (my department) to filter and distribute massive client lists.

 

Step 1: ETL

  • I have an app that runs every month after our main company data sources have been refreshed:

51.png

This results in several YXDBs that are used in the models. Not all YXDBs are used in all models. This creates a central repository for all YXDBs, from which each specific model can pull in what is needed.

  • We also make use of Calgary databases for our really large data sets (billions of records).

52.png

Once all the YXDBs and CYDBs are created, we then run our models. Here is just one of our 24 models:

53.png

  • Our Data Scientists like to write raw R-code, so the R tool used before the final Output Tool at the bottom contains their code:

54.png

The individual model scores are stored in CYDB format, to make the app run fast (since the end-user will be querying against millions and millions of records). Client information is also stored in this format, for this same reason.

 

Step 2: App

  • Since the end-user will be making selections from a tree, we have to create the codes for the various trees and their branches. I want them to be able to pick through two trees – one for the model(s) they want, and one for the client attributes they want. For this app, they must choose a model, or no results will be returned. They DO NOT have to choose client attributes. If no attribute is chosen, then the entire client base will be returned. This presents a challenge in key-building, since normally an app that utilizes trees only returns values for keys that are selected. The solution is to attach keys to each client record for each attribute. My module to build the keys as described is here (the user can choose from 12 different attributes):

545.png
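The key trick here is that an empty selection must mean "everyone", not "no one". One way to sketch that semantics, assuming every client record carries a key per attribute (attribute names below are invented):

```python
# Sketch of the "no selection = whole client base" filtering semantics;
# attribute names and keys are hypothetical.

def filter_clients(clients, selections):
    """clients: list of dicts keyed by attribute; selections: {attr: accepted
    key set}. An attribute with no selection filters nothing at all."""
    result = clients
    for attr, accepted in selections.items():
        if accepted:  # empty set = the user picked nothing for this attribute
            result = [c for c in result if c[attr] in accepted]
    return result
```

In the actual app this filtering is done with Calgary joins against the pre-built key tables, one per attribute, which is what keeps queries over millions of records fast.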

  • Here is what the client database looks like once the keys are created and appended:

56.png

  • The model keys do not have to be as complex a build as client keys, because the user is notified that if they don’t make a model selection, then no data will be returned:

57.png

  • Once the key tables are properly made, we design the app. For the model selection, there is only one key (since there is only one variable, namely, the model). This is on the far right hand side. This makes use of the very powerful and fast Calgary join (joining the key from the pick-list to the key in the model table). For the client table, since there are 12 attributes/keys, we need 12 Calgary joins. Again, this is why we put the database into Calgary format. At the very end, we simply join the clients returned to the model selected:

58.png

 

Step 3: Gallery

  • Using our private server behind our own firewall, we set up a Gallery and Studio for our apps:

59.png

  • The app can now be run, and the results can be downloaded by the end-user to CSV (I even put a link to an “at-a-glance” guide to all our models):

591.png

  • The user can select the model(s) they want, and the scores they want:

592.png

And then they can select the various client criteria:

593.png

Once done running (takes anywhere between 10 – 30 seconds), they can download their results to CSV:

594.png

 

Describe the benefits you have achieved

We no longer have to send out two dozen lists to end users, and end users no longer have to wait for me to send them (they can get the lists on their own). Giving them a self-service tool is far more efficient and streamlined.

Author: Michael Peterman, CEO 

Company: VeraData

 

Awards Category: Best 'Alteryx for Good' Story

 

We provide deep analytics services for hundreds of clients. Of particular interest is the National Children's Cancer Society (NCCS). This prestigious and highly respected organization has been doing more for the families of children with cancer since 1987 - yep, for almost 30 years. We are honored to be serving them as a client.

 

Describe the problem you needed to solve 

NCCS, like every other large charity in America, sends out direct mail fundraising solicitations to support these families. Just as any business has to spend money to acquire new customers, non-profit organizations spend money to acquire donors. They were faced with a year-over-year trend of increasing donor acquisition costs and increasing costs to reactivate lapsed donors. This was coupled with a concern that there was a shrinking universe of potential donors willing to support their efforts.

 

Describe the working solution

Enter VeraData. Our initial engagement with NCCS was to build a donor acquisition model to reduce their cost to acquire donors, which in turn reduces the cycle time to break even on the investment in new donors. Concurrently, we developed a lapsed-reactivation model that used a wealth of external information to select, from their audience of former donors, the individuals most likely to donate again, thereby increasing the universe of marketable names while maintaining the cost to reactivate. Lastly, our third component was to uncover an expanded universe of individuals who had the propensity to support the NCCS. This meant identifying new data sets and determining which individuals would be profitable to pursue.

 

There were several methodologies deployed to achieve these goals. Our analytic team settled on a series of support vector machine models solving for response rate, donation amount, package and channel preferences, etc. All of the information in our arsenal was called upon to contribute to the final suite of algorithms used to identify the best audience. Using Alteryx, R, Tableau and our internal machine learning infrastructure, we were able to combine decades worth of client side data with decades worth of external data and output a blended master analytic database that accounted for full promotional and transactional history with all corresponding composite data on the individuals. This symphony achieved all of the goals, and then some.

 

 

Describe the benefits you have achieved

The client experienced a 24% reduction in their cost to acquire a donor, they were able to reactivate a much larger than anticipated volume of lapsed donors (some were inactive for over 15 years) and they discovered an entirely new set of list sources that are delivering a cost to acquire in line with their budget requirements. Mission accomplished.

 

Since that point, we have broadened the scope of our engagement and are solving for other things such as digital fundraising and mid-level and major donors. This wouldn't have been possible with the same speed and precision had we not been using Alteryx.

Author: Qin Lee, Business Analyst

Company: MSXI

 

Awards Category: Most Unexpected Insight

 

Huge data sets, large files, and multiple applications have been brought together, saved, and shared in a compact Alteryx file. And now I can test the scripts and code and find the errors. This is a good way to develop a proof of concept for our company.

 

Describe the problem you needed to solve 

We needed to go through many applications to get the data and save it into one location to share and view.

 

Describe the working solution

We are blending data sources from SQL Server, Access, Excel, and Hadoop - yes, we are leveraging many parties' data. We are developing the workflows and functions for a proof of concept now, and yes, we are exporting to a visualization tool.

 

 

 

Describe the benefits you have achieved

We collected data from many locations, saved it into a compact Alteryx database file, created the workflow and functions, developed a search engine, and designed the proof of concept for approval and launch. This saved time, resolved the problem, and increased customer satisfaction. I would like to send my sincere thanks to Mr. Mark Frisch (@MarqueeCrew), who helped us for many days to finish this project.