
Past Analytics Excellence Awards


Author: Andrew Simnick, Head of Strategy

Company: Art Institute of Chicago

 

Awards Category: Best 'Alteryx for Good' Story

 

The Art Institute of Chicago is one of the world's greatest art museums, staying true to our founding mission to collect, preserve, and interpret works of art of the highest quality from across the globe for the inspiration and education of our audiences. Today we face new competition for visitor attention, a continued responsibility to expand our audiences, and an increasingly challenging economic environment. Alteryx has allowed us to quickly overcome our data and resource constraints, develop a deeper understanding of our local audiences, and strike a balance between mission- and revenue-driven activities so that we can continue to deliver on our mission for Chicago.

 

Describe the problem you needed to solve 

Like other museums, we face the challenge of growing our audience while maintaining a strong financial foundation. Our strategy has been to increase visit frequency from our core visitor segments in the near term and use this increase to further expand outreach to new local audiences. However, the obstacles to achieving this have been three-fold. First, visitor segmentation in the arts and culture space is a relatively recent concept, and general segmentation schemas are not always applicable to Chicago at a granular level. Second, we have very useful data, but in inconsistent formats ranging from handwritten notes and Excel documents to normalized but disconnected databases. Third, we are resource-constrained as an institution and cannot dedicate large amounts of time or money to dedicated analytics or external consulting.

 

Describe the working solution

First, we built a database describing the Chicago CBSA at the census block group level, providing the nuance necessary for a city where demographics change block-to-block and limit the utility of ZIP code analysis. Alteryx allowed us to reach this additional level of detail and make our analysis relevant to Chicago. Using the Allocate Input and Calgary Join tools, we applied information from the US Census as well as Experian data sets. We used basic data such as population, income, and education, as well as proprietary Experian segments such as Mosaic groups and ISPSA (Index of Social Position in Small Areas) to describe these census block groups.
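The enrichment step above is, at heart, a keyed join of demographic attributes onto block-group records. A minimal Python sketch of that idea (all GEOIDs, values, and field names below are made up for illustration; the real work used Alteryx's Allocate Input and Calgary Join tools):

```python
# Illustrative sketch: enriching census block groups with demographic
# attributes via a keyed join. All identifiers and values are hypothetical.

block_groups = [
    {"geoid": "170318391001"},
    {"geoid": "170318391002"},
]

# Demographic attributes keyed by block-group GEOID (made-up values).
demographics = {
    "170318391001": {"population": 1250, "median_income": 54000, "mosaic_group": "A01"},
    "170318391002": {"population": 980, "median_income": 71000, "mosaic_group": "B07"},
}

def enrich(block_groups, demographics):
    """Left-join demographic attributes onto each block group record."""
    enriched = []
    for bg in block_groups:
        attrs = demographics.get(bg["geoid"], {})  # empty if no match
        enriched.append({**bg, **attrs})
    return enriched

for row in enrich(block_groups, demographics):
    print(row["geoid"], row.get("mosaic_group"))
```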

 

Second, we brought together our disparate visitor data into a blendable format. Some of our datasets are well defined, such as our membership CRM which resides in a relational database on MSSQL Server, whereas others are more ad hoc, such as our Family Pass users, which are transcribed from pen and paper into an Excel document. The Join tools in Alteryx provided a simple way to bring these data together without commanding significant time from our small analytics team.

 

Third, each of these datasets was CASS encoded and geocoded using the US Geocoder tool, giving us a spatial object. We then used the Spatial Match tool to find the intersection of these objects with our universe of Chicagoland block groups. Each of these distinct streams was then normalized and combined at the block group aggregation level, resulting in our final dataset. We also used a shared public custom macro to convert these block groups into polygons for visualization in Tableau.
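The spatial-match step boils down to a point-in-polygon test: does this geocoded visitor point fall inside this block group's polygon? A pure-Python sketch using the classic ray-casting test (the coordinates and block-group IDs are made up; the real workflow used Alteryx's Spatial Match tool):

```python
# Illustrative sketch of the spatial-match step: assigning a geocoded
# visitor point to the block group whose polygon contains it.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside polygon (a list of (x, y) vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at height y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Two toy "block groups" as simple rectangles (hypothetical IDs).
block_groups = {
    "BG-001": [(0, 0), (1, 0), (1, 1), (0, 1)],
    "BG-002": [(1, 0), (2, 0), (2, 1), (1, 1)],
}

def match_block_group(x, y):
    for geoid, poly in block_groups.items():
        if point_in_polygon(x, y, poly):
            return geoid
    return None

print(match_block_group(0.5, 0.5))  # BG-001
print(match_block_group(1.5, 0.2))  # BG-002
```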

 

Finally, we utilized heatmaps and scatterplots to identify which proprietary Experian segments correlate with our different offerings. This informed our choice of variables for our final Decision Tree tool analysis, which identified prime target block groups associated with each offering. These bespoke segments created via machine learning were more applicable to our own audiences and required a fraction of the time and cost of other segmentation methods.
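Conceptually, the decision-tree step learns simple rules that separate "prime target" block groups from the rest. A hand-rolled one-level tree (a decision stump) over a single hypothetical feature gives the flavor; the real analysis used Alteryx's Decision Tree tool over Experian segments:

```python
# Conceptual sketch: learn the threshold on one numeric feature that best
# separates responding block groups from non-responding ones. The feature
# ("affinity index") and all values are hypothetical.

def fit_stump(values, labels):
    """Find the threshold minimizing misclassification when predicting
    'target' (1) for every value >= threshold."""
    best_t, best_err = None, float("inf")
    for t in sorted(set(values)):
        errors = sum((v >= t) != bool(y) for v, y in zip(values, labels))
        if errors < best_err:
            best_t, best_err = t, errors
    return best_t

# Hypothetical affinity-index per block group, and whether that block
# group actually responded well to an offering (1) or not (0).
index = [12, 25, 40, 55, 61, 78, 90]
responded = [0, 0, 0, 1, 1, 1, 1]

threshold = fit_stump(index, responded)
print(threshold)  # 55: block groups with index >= 55 are flagged as targets
```

A real tree recursively applies this kind of split across many features; the stump is just the smallest honest illustration of the idea.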

 

Describe the benefits you have achieved

This approach has given us a framework and the supporting intelligence from which to make institutional decisions surrounding visitor outreach and programming, allowing us to focus our resources on actions which we believe will have the greatest impact towards increased participation, attendance, and/or revenue. For example, we can now tailor membership messaging more effectively and quantify the effects on repeat visitation. We also can identify gaps in our geographic coverage of Chicagoland and test different outreach efforts to engage new audiences. Most importantly, we can unify our approach to audience development across departments using a common baseline and methodology. These combined efforts enabled by Alteryx will help us to build our audiences and fulfill our civic responsibilities well into the future.

Author: Andrew Kim, Analyst (@andrewdatakim)

 

Awards Category: Name Your Own - Scaling Your Career with Alteryx

 

Describe the problem you needed to solve 

Deciding which tool to invest your time in is a problem everyone faces in their career. The tools we are given in college and the tools the professional world actually uses are starkly different. I quickly discovered that a career with both the opportunity to build a company from scratch and the flexibility to work in a Fortune 100 environment requires knowledge of assets that can scale without a significant investment of time or money. My background is in Marketing and Finance, with most of my work experience in small to midsize companies where every person is required to be and do more for the company to survive.

 

Describe the working solution

I set out to find these tools three years ago with the understanding that information drives a business, which led me to the Gartner reports. I went through trials of a dozen different options and even contracted assistance from a developer of one of the software options. Alteryx quickly became my tool of choice, and it contributed greatly to my previous company's growth from $250k in annual online revenue to $12 million in two years. The ability to access multiple data source types, leverage Amazon MWS data, and use historical competitive landscape information allowed us to create the perfect dashboards in Tableau to analyze current inventory and buying opportunities in ways that were previously inconceivable. I was able to save 10,000 labor hours a day in discovering new products. Before Alteryx was purchased, the average buyer's assistant could review 200 Amazon listings per 8-hour day. After Alteryx, we were retrieving over 250,000 listings per run, multiple times a day (the math: 250,000 listings / 25 listings per hour = 10,000 hours per run). The primary customers in this scenario were the buyers for the company. By taking all of the data processed through Alteryx and providing them with Tableau dashboards to conveniently view current and historical product information, compared to the previous Excel models, we were able to maximize inventory turnover and margins.
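The labor-hours claim follows directly from the numbers in the story, and is easy to check:

```python
# Checking the throughput arithmetic from the story.
listings_per_day_manual = 200                           # per assistant, per 8-hour day
listings_per_hour_manual = listings_per_day_manual / 8  # 25 listings/hour
listings_per_run = 250_000

hours_saved_per_run = listings_per_run / listings_per_hour_manual
print(hours_saved_per_run)  # 10000.0 labor hours per run
```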

 

Describe the benefits you have achieved

Alteryx knowledge allowed me to advance to my current position as a Data Analyst/Programmer at a Fortune 50 company. I now work heavily with survey data, and again Alteryx has proven an indispensable asset even with the change in scale. Its versatility has allowed all of my skills to transfer from operational data to qualitative data without skipping a beat. I find Alteryx is an asset that has only increased my passion for data, and I am eager to see how I can continue to scale my career with it.

Author: Thomas Ayme, Manager, Business Analytics

Company: Adidas International Trading B.V

 

Awards Category: Name Your Own - Best Planning and Operational Use

 

Describe the problem you needed to solve 

As a new and successful business, adidas Western Europe eCommerce keeps growing faster and faster; new services are launched every week, an increasing number of marketing campaigns are run simultaneously, and so on. This means more and more products have to be shipped out every day to our end consumers.

 

This strong growth leads to an exponential increase in the complexity of forecasting our unit and order volumes, but also to bigger costs in the case of forecasting mistakes or inaccuracies.

 

As these outbound volumes keep increasing, we were faced with the need to develop a new, more accurate, more detailed, and more flexible operational forecasting tool.

 

Such a forecasting tool would have to cater to the complexity of forecasting for 17 different markets rather than a single pan-European entity. Indeed, warehouse operations and customer service depend on a country-level forecast to plan carriers and linguistic staff. This is a unique situation where, on top of having a rapidly growing business, we have to take into account local marketing events and market specificities.

 

Finally, given the importance of ensuring consumer satisfaction through timely delivery of orders, we also decided to provide a daily forecast for all 17 markets rather than the usual weekly format. This level of detail improves the warehouse's shipping speed, but also increases, once again, the difficulty of our task.

 

Describe the working solution

 

Our first challenge was to find reliable sources of information. Both business analytics (financial and historical sales data) and web analytics (traffic information) data were already available to us through SAP HANA and Adobe Analytics. However, none of our databases were capturing in a single place all information related to marketing campaigns, project launches, events, adhoc issues, etc.

 

That is why we started by building a centralized knowledge database, which contains all past and planned events that can impact our sales and outbound volumes.

 

This tool is based on an Alteryx workflow that cleans and blends together all the different calendars used by the other eCommerce teams. In the past, bringing those files together was a struggle, since some of them are based on Excel while others are on Google Sheets, and all use a different format.

 

[Image: Workflow Knowledge database.jpg]

 

We made the best of this opportunity of now having a centralized event database by also developing a self-service visualization tool in Tableau, which displays all those past and future events. Such a dashboard is now used to:

 

  1. Give some background to our stakeholders about what is driving the volumes seen in the forecast.
  2. Have an overview of the business during our review of the sales targets for the coming weeks.

 

Next, we created a workflow which, thanks to this new centralized event database, defines a set of "genes" for each past and upcoming day in each market. These genes flag potential ad hoc issues, commercial activations, discount levels, newsletter send-outs, etc.

 

This gene system can then be used to select the historical data for forecasting upcoming periods, by matching future days with past days that share the same, or at least similar, genes. This is the first pillar of our forecasting model.
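The matching idea can be sketched outside Alteryx as a set-similarity ranking: represent each day's genes as a set of flags, then pick the past days most similar to the upcoming day. Gene names and unit counts below are invented for illustration:

```python
# Illustrative sketch of "gene" matching: rank past days by how closely
# their gene flags match an upcoming day's genes, then use the best
# matches as the historical basis for the forecast.

past_days = {
    "2016-05-02": {"genes": {"newsletter", "discount_20"}, "units": 1200},
    "2016-05-09": {"genes": {"newsletter"}, "units": 900},
    "2016-05-16": {"genes": {"discount_20"}, "units": 1050},
}

def similarity(a, b):
    """Jaccard similarity between two gene sets."""
    return len(a & b) / len(a | b) if a | b else 1.0

def best_matches(target_genes, history, top_n=2):
    ranked = sorted(history.items(),
                    key=lambda kv: similarity(target_genes, kv[1]["genes"]),
                    reverse=True)
    return [day for day, _ in ranked[:top_n]]

upcoming = {"newsletter", "discount_20"}
print(best_matches(upcoming, past_days))  # '2016-05-02' ranks first
```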

 

The second pillar of our forecasting tool is a file containing our European weekly targets. These targets are constantly being reviewed based on new events shown in the centralized event database and current business trends. 

An Alteryx workflow derives from this target file our sales expectation for each upcoming day, market, category (full price, clearance), and article type (inline or customized). To do so, we use historical data selected by our genes, together with a set of algorithms, to calculate the sales impact ratio of each market and category. These ratios are then used to allocate a target to each combination.
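The allocation step amounts to splitting one European weekly target proportionally across market/category combinations and days. A sketch with invented markets and ratios (the real ratios come from the gene-selected history):

```python
# Sketch: split a weekly target into per-(market, category) daily targets
# using historical impact ratios. All numbers are illustrative.

weekly_target_eur = 700_000

# Historical share of sales by (market, category), summing to 1.0.
ratios = {
    ("DE", "full_price"): 0.30,
    ("DE", "clearance"): 0.10,
    ("FR", "full_price"): 0.25,
    ("FR", "clearance"): 0.05,
    ("UK", "full_price"): 0.22,
    ("UK", "clearance"): 0.08,
}

def allocate(weekly_target, ratios, days=7):
    """Allocate the weekly target to each (market, category) per day."""
    return {key: weekly_target * share / days for key, share in ratios.items()}

daily = allocate(weekly_target_eur, ratios)
print(round(daily[("DE", "full_price")]))  # 30000
```

In practice each day would get its own weight too (weekday vs. weekend, events), but the proportional split is the core mechanism.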

 

[Image: workflow forecasting.png]

 

Finally, both pillars are brought together, and a final Alteryx workflow derives how many orders and units will have to be placed in each market and for each article type.

 

However, since certain periods of time have gene combinations that cannot be matched, our working solution also gives us the flexibility to manually override the results. These forecast volumes are then shared with the team, warehouse, customer service call centers, etc. through a Tableau dashboard.

 

[Image: Knowledge database.png]

 

Describe the benefits you have achieved

Thanks to the work that went into developing this new forecasting model in Alteryx, the adidas WE eCommerce business ended up getting:

 

  • A more accurate forecasting model, which allows for a better planning of our operations.
  • Reduced operational costs.
  • A more detailed forecast as we can now forecast on a daily level, when past methods required much more work and limited us to a weekly forecast.
  • A flexible forecasting model that can easily be modified to include new services and sales channels.
  • A forecast dashboard that lets us easily communicate our forecast to an ever growing number of stakeholders.
  • A centralized event “calendar” that can be used by the entire department for much more than simply understanding the forecast (e.g. it is used to brief in Customer Service teams on upcoming events).
  • A massive amount of free time that can be used to drive other analyses, as we are no longer required to manually join together different marketing calendars and other sources of information, create manual overviews of the upcoming weeks, manually split our weekly sales target, etc.

Authors: Irina Mihai (@irina_mihai) , Web Analyst 

                  Johannes Wagner, Senior Business Analyst

Company: Adidas International Trading B.V.

 

Awards Category: Name Your Own - Creating the New

 

Describe the problem you needed to solve 

The eCommerce business division was facing the challenge of tracking and steering the performance of over 9,000 articles.

 

Senior management had an overview of top-level numbers, but the people who could actually take action and steer the business at an operational level had limited information.

 

Merchandizers tracked the sales of only the most important product franchises, which generated roughly 60% of the business, but they did not have an overview of article size availability and warehouse stock. This was vital in order to know whether driving more online traffic to an article would lead to more sales, or to disappointed customers who couldn't find their size. Besides stock information, merchandizers also needed BI data and web analytics data for a holistic understanding of article and franchise performance, a situation which caused delays in acting upon information and steering the business proactively.

 

Moreover, the full product range, and especially the low-key franchises (40% of the business), was reported on only an ad hoc basis. No actions were taken on the less important franchises, which led to unrealized opportunities, as unsold products are heavily discounted at the end of the season.

 

Given this complex business environment, and the time needed to get hold of data that often becomes obsolete before reaching the relevant stakeholders in a digestible format, we needed to give transparency on all product franchises and provide all the relevant information needed to take action and drive the business at both aggregated and granular levels, in real time, in one place, available to everyone, in an automated way.

 

To sum up, the drivers that led to a new way of working within analytics were:

 

  • Tracking ongoing performance on all articles improves our margin, as we can drive sales during the season and avoid heavy discounting at the end of it. Offering too many discounts also has a negative long-term impact on the brand and teaches consumers to buy on discount, so we wanted to make sure we maximize opportunities within the season.
  • Besides immediate financial returns, we are also thinking of the consumer experience and the fact that not finding their desired sizes online disappoints customers. Being able to drive demand planning proactively and ensure enough supply is available is a way to keep customers happy and returning to our site.

 

Describe the working solution

Alteryx has allowed us to tap into multiple sources of data in a fast, scalable way not possible before, which allows us to be truly agile and data driven as an organization.

 

On a high level, the data sources used in the workflow are:

  • BI data incl. sales data and standard margin per article per day
  • Waiting List data from the CRM system indicating the number of times an out of stock product was placed on the waiting list
  • Article master data from the range management application
  • Demand planning master data with the estimated bought quantity per size which defines the relative importance of each size of an article
  • Web analytics data for product views and conversion rate  
  • Stock quantity data from the online platform with the daily stock snapshot on size level
  • Product range files manually maintained for retail intro date,  marketing campaign information, and original sales forecast quantity per month

 

There are 3 work streams in the main workflow:
    1.1 Calculation of daily sales forecasts per article number, based on the product range files and master data file.

Several operations are done to clean up the data, but the most important part is transforming the monthly forecast to a daily level while also taking into account the retail intro date. For example, if an article has a retail intro date in the middle of the month, we only generate a forecast for the days after that date, and not before, to maintain accuracy.
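The monthly-to-daily transformation with the intro-date rule can be sketched as: zero forecast before the intro date, and the monthly quantity spread evenly over the remaining days (dates and quantities below are invented):

```python
# Sketch: spread a monthly forecast to daily level, respecting the retail
# intro date. Days before the intro date get no forecast; the monthly
# quantity is spread over the remaining days.
from datetime import date, timedelta

def daily_forecast(monthly_qty, year, month, days_in_month, intro_date):
    first = date(year, month, 1)
    days = [first + timedelta(d) for d in range(days_in_month)]
    active = [d for d in days if d >= intro_date]
    per_day = monthly_qty / len(active) if active else 0
    return {d: (per_day if d in active else 0) for d in days}

# Article launches mid-month: forecast only from the 16th onward.
fc = daily_forecast(300, 2016, 6, 30, date(2016, 6, 16))
print(fc[date(2016, 6, 15)], fc[date(2016, 6, 16)])  # 0 20.0
```

A real version would also weight days unevenly (weekday patterns, events), but the intro-date rule is the accuracy-critical part described above.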

 

[Image: Picture1.png]

 

1.2 Data-cleansing operations on web analytics and BI data, with a subsequent join on article and day level

 

For each data type, we have created a historical Alteryx database that is unioned with newly cleansed data; the result is then written back into the historical database.
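The union-and-write-back pattern is essentially an append with deduplication on the record key. A stdlib sketch (the (article, day) key and field names are illustrative):

```python
# Sketch of the historical-database pattern: union newly cleansed rows
# with existing history, dropping duplicates on the (article, day) key,
# then treat the result as the new history. New rows win over old ones.

def update_history(history, new_rows, key=("article", "day")):
    seen = set()
    merged = []
    for row in new_rows + history:  # new rows first, so they take precedence
        k = tuple(row[f] for f in key)
        if k not in seen:
            seen.add(k)
            merged.append(row)
    return merged

history = [{"article": "A1", "day": "2016-06-01", "views": 100}]
new_rows = [
    {"article": "A1", "day": "2016-06-01", "views": 120},  # restated figure
    {"article": "A1", "day": "2016-06-02", "views": 90},
]
updated = update_history(history, new_rows)
print(len(updated), updated[0]["views"])  # 2 120
```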

 

[Image: Picture2.png]

 

1.3 Join of the daily sales forecast with the web analytics data, BI data and wishlist data on article and day level

[Image: Picture3.png]

 

Here we also calculate the actual retail intro date for each article, based on the first day the product gets online traffic, giving us visibility into products that were launched late.

 

  2. In a second workflow, we calculate the stock availability per article size, as well as the size and buy availability per article. This is based on the master data file indicating the buy percentage per size and article, and the stock snapshot indicating the size availability per article. The output is a Tableau data extract.
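The two availability measures can be sketched directly from those inputs: size availability is the share of sizes in stock, and buy availability weights each in-stock size by its planned buy percentage (the shares and stock figures below are invented):

```python
# Sketch of the availability calculation for a single article.
# buy_share: planned buy quantity share per size (from master data).
# in_stock: current units per size (from the daily stock snapshot).

buy_share = {"S": 0.15, "M": 0.35, "L": 0.35, "XL": 0.15}
in_stock = {"S": 0, "M": 12, "L": 3, "XL": 0}

# Share of sizes with any stock at all.
size_availability = sum(1 for qty in in_stock.values() if qty > 0) / len(in_stock)

# Share of the planned buy covered by sizes currently in stock.
buy_availability = sum(share for size, share in buy_share.items() if in_stock[size] > 0)

print(size_availability, buy_availability)  # 0.5 0.7
```

The weighting matters: here half the sizes are available, but they cover 70% of expected demand, which is the more decision-relevant number for merchandizers.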

[Image: Picture4.png]

 

The outputs of the two workflows are then visualized in a Tableau dashboard that has a flow-like structure allowing users to see performance of the product franchises on high level and also drill down into details on article level:

 

[Image: Picture1.png]

 

[Image: Picture2.png]

 

[Image: Picture3.png]

 

[Image: Picture4.png]

 

 

Describe the benefits you have achieved

First of all, without Alteryx the Trading Dashboard would not have been possible, due to the sheer amount of data sitting in different systems and the manual work involved in retrieving and combining it at the same level of granularity.

Alteryx has allowed us to blend a variety of data sources in a scalable way and achieve the following business benefits:

 

  • In terms of time savings: prior to Alteryx, two full-time employees would have been needed to compile an in-season daily snapshot of the most important product franchises (60% of the business) with all the relevant metrics. By the time this report reached stakeholders, the information would have been obsolete, too late to react to consumer behavior in real time. Now, with the help of Alteryx, it takes the analytics team 10 minutes per day to provide a holistic dashboard to both senior management and the employees who can take quick decisions and steer the business based on real-time data.
  • Increased revenue and margin optimization: our merchandisers and category managers now have a complete daily overview of how each and every article is performing. Due to the exploratory and intuitive nature of the dashboard (from top level down to article level, with coloring based on forecast achievement), they can easily identify which product franchises and individual products are falling behind the sales forecast and which specific levers to pull in order to increase sales. Example actions are driving more traffic, improving on-site merchandising, restocking particular sizes, and decreasing the price.
  • Customer satisfaction: as sizes are restocked faster than before, thanks to the demand planning department's new proactive way of working, consumers are happier that they can purchase their desired sizes. This leads to more customers returning to our site, because they know they can find sizes here that are not available in retail stores.

 

We have recently introduced the Trading Dashboard, and there is already a mindset shift happening: different departments work more closely together to identify opportunities and act on the data. We believe Alteryx has enabled us to reach our ambitious growth targets, improve customer satisfaction, and operate as a data-driven organization.

 

Author: Suzanne McCartin (@SMCCARTI), Sr. Ops Reporting Analyst

Company: Nike, Inc.

 

Awards Category: Name Your Own - Get Back In Time

 

Describe the problem you needed to solve 

My two personal favorite Nike values are 'Simplify and Go' and 'Evolve Immediately'! In Nike Apparel Global Product Creation Operations, our immediate need was to replace a critical, core data set on a tight timeline, making sure our product creation centers didn't lose buy-tracking visibility. Buy readiness is the measure and metric for garment commercialization: do we have everything we need to purchase? This was just the beginning...

 

Describe the working solution

The buy ready metric process was implemented using a combination of tools, and the first step was to replace the one data source, adding Alteryx to the tool mix. The build process was then reconstructed and migrated to Alteryx using blending and in-database tools, going from about a 5-hour process to 1 hour.

 

The next follow-up solution was to upgrade the report generation processes. The first solution was one process for each output, each with its own data collection process. These were consolidated into one workflow using a shared data collection process, allowing me to enforce Nike's single-version-of-the-truth mantra! This solution involves all kinds of data cleaning, mapping, and shaping.

 

Describe the benefits you have achieved

The first-round benefit was getting the upgrade done, and we did so with improved accuracy and data visibility. The real benefit was that the process got us "back to the future": we are now lined up to better collaborate with IT and move to Tableau and other new platforms!

Author: Kiran Ramakrishnan

 

Awards Category: Most Time Saved 

 


 

Describe the problem you needed to solve 

We are a semiconductor company located in Silicon Valley. We have been in business for more than 30 years, with 45 locations globally and about 5,000 employees. We are in business to solve our customers' challenges, and we are a leader in driving innovation, particularly in microcontrollers. The company focuses on the embedded processing, security, wireless, and touch technology markets. In automotive, we provide solutions beyond touch, such as remote keyless entry and networking. Our emphasis is on IoT applications: we see potential in the Internet of Things market, combining our products, especially MCUs, security, and wireless technologies.

 

In this industry, planning is essential, as the market is very dynamic and volatile while manufacturing cycles are long. Most electronic applications have comparatively short product life cycles and sharp production ramp cycles. Ignoring these ramps could result in over- or under-capacity. For a semiconductor company, it is key to clearly understand these dynamics and take appropriate action within an acceptable time.

 

To forecast and make appropriate predictions, organizations need critical information such as the actual forecast, billings, backlog, and bookings. Based on this information, Sales, the BUs, and Finance are able to build models. As End-of-Life (EOL) parts convert immediately into revenue, we need to treat them separately. Typically, semiconductor sales are commission-based; sales commissions are calculated by product category and type, so each line item needs to be matched to a salesperson by product life cycle. In public companies this is done on a quarterly basis, and regular updates increase an organization's confidence in achieving its goals. As electronics companies demand ever higher security levels for data access, the consolidated dataset needs to be protected to ensure compliance with customer agreements. Large organizations also require data security to ensure data is only accessible on a need-to-know basis.
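The line-item matching described above can be sketched as two lookups per billing line: the product's life-cycle stage (to separate EOL parts) and the responsible salesperson. Every name, region, and rate below is hypothetical:

```python
# Sketch of commission matching: each billing line is matched to a
# salesperson and flagged by product life cycle so EOL parts can be
# treated separately. All reference data here is invented.

lifecycle = {"MCU-100": "active", "MCU-900": "EOL"}
rep_by_region = {"west": "Alice", "east": "Bob"}
rates = {"active": 0.02, "EOL": 0.01}  # hypothetical commission rates

def commission(line):
    stage = lifecycle[line["part"]]
    return {
        "rep": rep_by_region[line["region"]],
        "stage": stage,
        "commission": line["amount"] * rates[stage],
    }

line = {"part": "MCU-900", "region": "west", "amount": 10_000}
print(commission(line))  # {'rep': 'Alice', 'stage': 'EOL', 'commission': 100.0}
```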

 

[Image: user_guide.gif]

Historically, people from these different groups manually created, cleansed, and merged data and information across various files and sources to gain insight. It is common to use different environments such as Oracle DBs, SAP, ModelN, SharePoint, Salesforce, Excel, and Access. This is extremely time-consuming and requires a huge manual effort. Usually, data consistency between the different sources is not guaranteed, which requires additional cleansing and manipulation. As every person or group also has their own way of gathering and consolidating this information, the results and layouts typically differ, and it is hard for someone outside the group to clearly understand another person's approach. These reports are a regular necessity and need to be compiled on weekly or daily refresh frequencies. We also wanted to become independent of specific resources for updating dashboards on demand, as the current process made the reporting heavily reliant on particular people.

 

Describe the working solution

In Alteryx we found the solution to our problem. Alteryx was used to join data sources in different formats and environments, gathered from different departments including Sales, Finance, Operations/Supply Chain, and Human Resources.

 

  • The Sales department provides the Forecast in an Excel worksheet. As the worksheet is accessed and edited by more than 500 individuals, data inconsistency between fields (such as the time dimension) is an ongoing issue, and the data architecture needs to be re-organized and consolidated.
  • The Finance department provides Billings in Oracle Hyperion format. There are data inconsistencies between Billings and Backlog & Bookings due to system differences, so Billings need to be merged with Backlog & Bookings, and EOL parts for commissions and forecast are identified.
  • The Operations/Supply Chain department provides Backlog & Bookings through SAP, with the same data inconsistencies against Billings due to system differences. Backlog & Bookings need to be merged with Billings, and EOL parts for commissions and forecast are identified.
  • The HR department provides the organization hierarchy through SAP HANA, in order to apply row-level security on the dashboards later on.

 

To resolve these issues, all relevant data is structured and follows the overall data architecture defined in Alteryx. First, Alteryx pulls relevant data from the various sources and stores it on a shared drive/folder. Then, Alteryx runs its algorithms based on our definitions. A special script was developed to publish and trigger a refresh of the dashboard with the latest data on a daily basis. Finally, once the refreshed data is published, a notification is sent via email to all users (more than 500) with a hyperlink.
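The final notification step can be sketched with Python's standard email library: build a message containing the dashboard hyperlink for the subscriber list. The recipient address and URL are placeholders, and actual sending (via smtplib) is omitted:

```python
# Sketch of the refresh-notification step: build an email with a
# hyperlink to the refreshed dashboard. Addresses and URL are
# hypothetical; sending via smtplib is left out.
from email.message import EmailMessage

def build_notification(recipients, dashboard_url):
    msg = EmailMessage()
    msg["Subject"] = "Daily dashboard refresh published"
    msg["To"] = ", ".join(recipients)
    msg.set_content(
        f"The dashboard has been refreshed with the latest data:\n{dashboard_url}"
    )
    return msg

msg = build_notification(["analyst@example.com"], "https://example.com/dashboard")
print(msg["Subject"])
```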

 

[Image: Workflow.png]

 

Describe the benefits you have achieved

Prior to the Alteryx implementation, a lot of time was spent downloading, storing, and consolidating the files, which resulted in multiple unexpected errors that were hard to identify. The accuracy of, and confidence in, the manually created dashboard was not very high, due to human error. Very often, the dashboards required so much preparation that by the time they were published they were already outdated.

 

Through the Alteryx approach, we have now eliminated manual intervention and reduced the effort to prepare and publish/distribute the reports to less than 1% of the previous approach. In addition, this streamlined approach has stimulated collaboration on a global basis.

 

Departments such as IT, Finance, and Sales are able to work much more closely together, as they see results within an extremely short period of time.

The other advantage of this solution is that it is now broadly used throughout the organization, from the CEO to analysts, based on the defined security model.

 

How much time has your organization saved by using Alteryx workflows?

It took us one week to create and develop the workflow. The biggest challenge we faced was determining the individual steps and the responsible person, as various resources and departments were required to contribute.

 

Through the Alteryx workflow we are able to save more than 15 hours per week on data merging alone, and at the same time we are now able to publish the reports and analyses on a daily basis. Through Alteryx we are now saving over 75 hours across various departments in running the process end-to-end on a daily basis.

 

What has this time savings allowed you to do?

Through automating the process we received a lot of management attention and a desire to create more automated and on-demand dashboards and reports.

 

Another area where we have benefited significantly is training and process consistency. No more are we reliant on training new resources on learning the systems and process or critically affected by sudden departure of a team member.

Author: Jim Kunce, SVP & Chief Actuary

Company: MedPro Group

 

Awards Category: Best Use of Server

 

Describe the problem you needed to solve 

MedPro Group, Berkshire Hathaway's dedicated healthcare liability solution, is the nation's highest-rated healthcare liability carrier, according to A.M. Best (A++ as of 5/27/2015). We have been providing professional liability insurance to physicians, dentists, and other healthcare providers since 1899. Today, we have insurance operations in all 50 states and the District of Columbia, and we are growing internationally. With such a large scale of operations and diversity of insurance products, it is a challenge to connect systems, processes, and employees with one another.

 

Regardless of an insurance carrier's size and scale, its long-term success depends on:

 

  • Continued new business growth
  • Consistent pricing and risk-evaluation
  • Unified internal operations

Our challenge was to lay the analytical foundation necessary for an ever-growing insurance company to execute on these three objectives. We identified the following three action items and linked them to the drivers of long-term success.

 

  • Fuel new business growth: centralize processes, remove system silos, and link manual processes together.
  • Drive consistent pricing and risk-evaluation: remove data supply bottlenecks and empower business analysts to self-serve.
  • Unify internal operations: accelerate modernization and facilitate enterprise-wide legacy system integration.

 

Describe the working solution

"Fuel new business growth: centralize processes, remove system silos, and link manual processes together."

 

This solution has three parts to it.

  • First, we programmed our pricing algorithm using Alteryx to "learn" the insurability of a prospective customer.
  • Second, we overlaid this system on our CRM data to create sales recommendations nationwide.
  • Third, we deployed this recommender system with our Alteryx private gallery to provide real-time access to our sales teams.

MPG Private Gallery Snapshot.jpeg

 

Today, from anywhere in the country, our sales personnel can request a report for a customer they are prospecting and receive a consistent, reliable recommendation in a matter of seconds with little manual intervention.

 

"Drive consistent pricing and risk-evaluation: Remove data supply bottlenecks & empower business analysts to self-serve."

 

In an insurance company, actuaries and underwriters are responsible for pricing insurance policies and evaluating insurance risks of applicants. These complex decisions rely on many data inputs - some of which are internally available, but in other cases come from external sources (e.g. government websites, third party resources).

 

Today, we have significantly reduced data supply bottlenecks by configuring Alteryx Server as the bridge between the data sources and our actuaries and underwriters. Each person along the pricing and risk-evaluation process now gets "analysis-ready" data consistently and on time from the private gallery, a virtual buffet of self-serve apps for all data needs.

 

"Unify internal operations: accelerate modernization -- facilitate enterprise-wide legacy system integration."

 

In 2015, MedPro Group decided to scale up investments in modernizing legacy systems to a new web-based system. The challenge was to move data from our legacy systems into the new web-based system and vice versa. Additionally, the software solution needed to have a short learning curve and be flexible and transparent enough that key business leaders managing this modernization would be able to perform the data migration tasks.

 

Alteryx was a great fit in this case. Not only were business leaders able to program processes in Alteryx in a relatively short timeframe, we scaled up with ease and accelerated modernization by deploying on the private server for analysts to use in a self-serve, reliable environment.

 

Describe the benefits you have achieved

"We have connected systems, processes and employees to one another and made the benefits of that interconnectivity available to every employee."

 

We have been using the private gallery and server since July, 2015. What started as a proof of concept and experiment is now a fully functional production-grade experience. The list of systems that have been connected, processes that have been automated and employees who are finding value out of our private gallery and server is growing rapidly.

 

Here's a view into some of the measurable benefits we have achieved in just nine months -

  • 94: The number of apps published to the private gallery to date.
  • 6951: The number of times an app has run on the gallery. That's 26 runs a day over 9 months!
  • 15: The percentage of employees who are served with this consistent, reliable self-serve platform.

 

And our goal? Move that needle to 100% with Alteryx in the months to come!

Author: Andy Moncla (@AndyMoncla), Chief Operating Officer & Alteryx ACE

Company: B.I. Spatial

 

Awards Category:  Best Use of Spatial

With Spatial in our company name, we use spatial analytics every day. We use spatial analytics to better understand consumer behavior, especially relative to the retail stores, restaurants and banks consumers use. We are avid proponents and users of customer segmentation, and we rely on Experian's Mosaic within ConsumerView. In the last two years we have invested heavily in understanding the appropriate use of mobile device location data. We help our clients use mobile data to better understand their customers, as well as their competitors' customers and trade areas.

 

Describe the problem you needed to solve 

Among retail, restaurant and financial services location analysts, one of the hottest topics is using mobile device location data as a surrogate for customer intercept studies. The beauty of this data, when used properly, is that it provides incredible insight. We can define home and work trade areas, differentiate between a shopping center's trade areas and its anchors, understand shopping preferences, identify positive co-tenancies, and perform customer segmentation studies.

 

The problem, or opportunity, we wanted to solve was to: 

1. Develop a process that would allow us to clean/analyze each mobile device’s spatial data in order to determine its most probable home location 

2. Build a new, programmatic trade area methodology that would best represent the mall/shopping center visitors’ distribution 

3. Easily deliver the trade areas and their demographic attributes 

 

And, it had to scale. You see, our company entered into a partnership with UberMedia and the Directory of Major Malls to develop residence-based trade areas for every mall and shopping center in the United States and Canada – about 8,000 locations. We needed to get from 100 billion rows of raw data to 8,000 trade areas. 

 

Describe the working solution

Before I get into the details I’d like to thank Alteryx for bringing Paul DePodesta back as a Keynote Speaker this year at Inspire. Paul spoke at a previous Inspire and his advice to keep a journal was critical to the success of this project. I actually kept track of CPU and Memory usage as I was doing my best to be the most efficient. Thanks for the advice Paul. 

 

journal.png

 

Using only Alteryx Spatial, we were able to accomplish our goal. Without giving away the secret sauce, here’s what we did. We divided the task into three parts which I will describe below. 

 

1.  Data Hygiene and Analysis (8 workflows for each state and province) – The goal of this portion was to identify the most likely home location for each unique device. It is important to note that the raw data is fraught with bad data, including common device identifiers, false location data and location points that could not be a home location. To clean the data, nearly all of the 100 billion rows of data were touched dozens of times. Here are some of the details.

a. Common Device Identifiers

i. The Summarize tool was used to determine those device ID’s, which were then used within a Filter tool 

ii. Devices with improper lengths were also removed using the Filter tool 

b. False Location Data – every now and again there is a lat/long that has an inexplicably high number of devices (think tens or hundreds of thousands). These points were eliminated using algorithms utilizing the Create Points, Summarization and Formula tools, coupled with spatial filtering.

c. Couldn’t be a Home Location – For a point to be considered as a likely home location, it had to be within a populated Census Block and not within other spatial features. We downloaded the Census Blocks from the Census and, utilizing the TomTom data included within Alteryx Spatial, built a series of spatial filter files for each US state and Canadian province. To build the spatial filters (one macro with 60+ tools), we used the following spatial tools:

i. Create Points 

ii. Trade Area 

iii. Buffer 

iv. Spatial Match 

v. Distance 

vi. Spatial Process Cut 

vii. Summarize - SpatialObj Combine 

 

Once the filters were built, all of the data was passed through them, yielding only those points that could possibly be a home location.

 

Typically, there are over one thousand observations per device, so even after the filtering there was work left to be done. We built a series of workflows that took advantage of the Calgary tools so that we could analyze each device, individually. Since every device record was timestamped, our workflows were able to identify clusters of activity over time and calculate the most likely home location. Tools critical to this process included: 

  • Sort 
  • Tile 
  • Multi-row Formula 
  • Calgary Join and Input 
  • Formula 
  • Create Points 
  • Trade Area 
  • Distance 

The Hygiene portion of this process reduced 100 billion rows of raw data to about 45 million likely home locations. 
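To make the clustering step concrete, here is a minimal Python sketch of one way to pick a "most likely home" from a device's timestamped points, by favoring overnight observations that cluster in the same small grid cell. The grid size, night window, and scoring rule here are assumptions for illustration; the actual workflows use the Alteryx Calgary and spatial tools listed above.

```python
from collections import Counter

def likely_home(points, cell=0.001, night=(20, 6)):
    """points: (lat, lon, hour_of_day) tuples for one device. Returns the
    centroid of the small grid cell with the most overnight observations."""
    start, end = night
    counts = Counter()
    for lat, lon, hour in points:
        if hour >= start or hour < end:          # overnight observation
            counts[(round(lat / cell), round(lon / cell))] += 1
    if not counts:
        return None
    (glat, glon), _ = counts.most_common(1)[0]
    return (glat * cell, glon * cell)

home = likely_home([
    (41.8781, -87.6298, 23), (41.8781, -87.6298, 2),
    (41.8782, -87.6297, 22),    # same cell, also at night
    (41.9000, -87.7000, 14),    # daytime point, ignored
])
```

In practice each device has over a thousand observations, so the scoring would weigh repeated clusters over many nights, but the bucket-count-pick shape of the logic is the same.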

 

2.   Trade Area Delineation (4 workflows/macros for each mall and shopping center, run iteratively until capture rate was achieved) – We didn’t want to manually delineate thousands of trade areas. We did want a consistent, programmatic methodology that could be run within Alteryx. In short, we wanted the trade area method to produce polygons that depicted concentrations of visitors without including areas that didn’t contribute. We also didn’t want to predefine the extent of the trade areas; i.e. 20 minutes. We wanted the data to drive the result. This is what we did.

a. Devised a Nearest Neighbor Methodology and embedded it within a Trade Area Macro – Creates a trade area based on each visitor’s proximity to other visitors. Tools used in this Macro include:

i. Calgary 

ii. Calgary Join 

iii. Distance 

iv. Sort 

v. Running Total 

vi. Filter 

vii. Find Nearest 

viii. Tile 

ix. Summarize – SpatialObj Combine 

x. Poly-Split 

xi. Buffer 

xii. Smooth 

xiii. Spatial Match 

 

b. Nest the Trade Area Macro within an Iterative Macro – By placing the Trade Area Macro within the Iterative Macro, Alteryx allows the Trade Area Macro to run multiple scenarios until the trade area capture rate is achieved 

c. Nest the Iterative Macro within a Batch Macro – Nesting the Iterative Macro within the Batch Macro allows us to run an entire state at once 
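The iterate-until-capture-rate logic can be sketched in Python. This toy version grows a simple radius around a center until a target share of visitors is captured; the real iterative macro delineates nearest-neighbor polygons rather than rings, so treat this only as an illustration of the loop, with all parameters invented:

```python
import math

def grow_until_capture(center, visitors, target=0.7, step=1.0, max_iter=100):
    """Expand a radius around `center` until at least `target` share of
    visitor points falls inside, then stop."""
    cx, cy = center
    radius = 0.0
    for _ in range(max_iter):
        radius += step
        inside = sum(1 for x, y in visitors
                     if math.hypot(x - cx, y - cy) <= radius)
        if inside / len(visitors) >= target:
            break
    return radius

# Four visitors; the loop stops once 3 of 4 (75%) are captured.
r = grow_until_capture((0, 0), [(1, 0), (0, 2), (3, 0), (0, 8)], target=0.75)
```

Letting the capture rate, not a fixed extent like 20 minutes, decide when to stop is what makes the data drive the result.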

 

The resultant trade areas do a great job of depicting where the visitors live. Although rings and drive times are great tools, especially when considering new sites, trade areas based on behavior are superior. For the shopping center below, a ring would have included areas with low visitor concentrations, but high populations. 

 

trade area with ring.png

 

3.  Trade Area Attribute Collection and Preparation (15 workflows) – Not everyone in business has mapping software, but many are using Tableau. We decided that we could broaden our audience if we simply made our trade areas available within Tableau. 

 

Using Alteryx, we were able to easily export our trade areas for Tableau. 

Tableau - trade area.png

 

We can also build ZIP Code maps. 

 

Tableau - zip code contribution.png

 

For our clients that use Experian’s Mosaic or PopStats demographics, Alteryx allows us to attach the trade area attributes. 

Tableau - mosaic bubbles.png

Tableau - PopStats.png

 

Describe the benefits you have achieved

The benefits we have achieved are incredible. 

 

The impact to our business is that both our client list and industry coverage have more than doubled without having to add headcount. By year end, we expect our clients’ combined annual sales to top $250 billion. Our own revenues are on pace to triple. 

 

Our clients are abandoning older customer intercept methods and depending on us. 

 

Operationally, we have repeatable processes that are lightning fast. We can now produce a store's or shopping center's trade area in minutes. Our new trade-area methodology has been very well received and is frequently requested. 

 

Personally, Alteryx has allowed me to harness my nearly 30 years of spatial experience, create repeatable processes, and continually learn and get better. It's fun to be peaking almost 30 years into my career. 

 

Since we have gone to market with the retail trade area product we have heard “beautiful”, “brilliant” and “makes perfect sense.” Everyone loves a pat on the back, but, what we really like hearing is “So, what’s Alteryx?” and “Can we get pricing?” 

Author: Jeffrey Jones (@JeffreyJones), Chief Analytics Officer

Company: Bristlecone Holdings

 

Awards Category:  Name Your Own - Most Entertaining (but Super-Practical) Use of Alteryx

 

Describe the problem you needed to solve 

Our marketing department needed a working Sex Machine, but that sort of thing was strictly prohibited in our technology stack.

 

Describe the working solution

Analytics built a functional Sex Machine! Let me explain...

 

Because our business involves consumer lending, we absolutely cannot - no way, no how - make any kind of decision based on sex or gender. Regulators don't want you discriminating based on that, so we don't even ask about it in our online application, nor do we store anything related to sex in our database. Sex is taboo when it comes to the Equal Credit Opportunity Act. But the problem was that the marketing department needed better insight into our customer demographics so they could adjust their campaigns and the messaging on our website, videos, etc., based on actual data instead of gut instinct.

 

Well, it turns out the Census Bureau publishes awesome (and clean) data on baby names and their sex. So we made a quick little workflow to import and join 134 years of births in the U.S. resulting in over 1.8 million different name/sex/year combinations. We counted the occurrences, looked at the ratio of M to F births for each and made some (fairly good) generalizations about whether a name was more likely a "Male" name or "Female" name. Some were pretty obvious, like "John." Others were less obvious, like "Jo." And some were totally indeterminate, like "Jahni."
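The name-to-sex generalization described above can be sketched roughly as follows. The thresholds, labels, and counts here are hypothetical illustrations, not the team's actual cutoffs or the real birth data:

```python
from collections import defaultdict

def classify_names(rows, confident=0.9, lean=0.7):
    """rows: (name, sex, count) tuples aggregated across all years.
    Labels each name by how lopsided its M:F birth ratio is."""
    totals = defaultdict(lambda: {"M": 0, "F": 0})
    for name, sex, count in rows:
        totals[name.lower()][sex] += count

    labels = {}
    for name, c in totals.items():
        share = max(c["M"], c["F"]) / (c["M"] + c["F"])
        majority = "Male" if c["M"] >= c["F"] else "Female"
        if share >= confident:
            labels[name] = majority                        # obvious, like "John"
        elif share >= lean:
            labels[name] = majority + " (low confidence)"  # less obvious, like "Jo"
        else:
            labels[name] = "Unknown"                       # indeterminate, like "Jahni"
    return labels

labels = classify_names([
    ("John", "M", 5000), ("John", "F", 20),
    ("Jo", "M", 300), ("Jo", "F", 120),
    ("Jahni", "M", 55), ("Jahni", "F", 50),
])
```

The resulting name-to-label table is exactly the kind of lookup that can then be joined against an application export without touching the decisioning engine.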

 

Then we joined this brand new data set to an export of our 200k customer applications and were able to determine the sex of around 90% of our applicants fairly reliably, another 7% with less reliability, and only 3% as completely unknown. The best thing about it is that we were able to answer these questions completely outside our lending technology stack, in a manner disconnected from our decisioning engine, so as to maintain legal compliance. We also didn't have to waste any money or time on additional customer surveys.

 

This was literally something that was conceived in the middle of the night and born into production before lunch the following day. (bow-chicka-bow-bow) Doing this wouldn't have been just impossible before Alteryx; it would have been LAUGHABLY IMPOSSIBLE, especially given the size of the third-party data we needed to leverage, the nature of our tech stack, and the way regulation works in consumer lending.

 

Describe the benefits you have achieved

It sounds silly, but our organization realized tangible benefits from doing this. Before, we had no idea about a critical demographic component of our customer base. It's impossible to look at a bank of nearly 200k names across four totally unrelated industry verticals and conclude anything about sex-related trends with any kind of confidence. Now we can understand sex-related trends in the furniture, bridal, pet, and auto industries. We can link sex to the products customers are actually getting and tweak the messaging on our website accordingly. And what's more, we're able to do all this in real time going forward, without wasting any of our DBAs' time or distracting our legal department. This probably saved us a hundred man-hours or more, given all the parties that would have needed to get involved to answer this simple demographic question.

 

We should probably tidy up this workflow and the .yxdb because it might be useful for other companies who want to get a full demographic breakdown but don't have any pre-existing information on customer sex. If anybody wants to know the total number of people born with every name for the last 134 years and needs the M:F occurrence ratio for each, holler at me.

Author: Francisco Aristiguieta, Audit Specialist

 

Awards Category: Name Your Own - Best Engagement From Management

 

Describe the problem you needed to solve 

With operations in all time zones and more than 10,000 people, my company needed an effective way to ensure we don't have rogue employees exposing us to corruption.

 

Before our Alteryx tool, we had a very complete compliance program focused on prevention, but we did not have a viable method to verify that the mandates were understood and followed across the globe.

 

Describe the working solution

Our plan was to inspect, every month, every payment the company had made for signs of potential problems. We would do this by searching each invoice line for keywords that could indicate issues.

 

The plan was simple, although the implementation would have been an enormous problem if we had not had Alteryx.  Here are a few of the (multiple) humps Alteryx helped us address:

 

1. Payment information was broken across multiple tables. Even though we were working with Oracle data, our IT department insisted that we work with offline copies of the tables instead of connecting directly. This made our data source a series of monthly CSV tables, where the tables had no meaning on their own.

 

>> Importing all files in a folder, and using "Unique", "Filter", "Select" and "Join", allowed me to conquer this first challenge.

 

2. I used "Find Replace" to do the keyword searches, which was a great step forward. Sadly, in many cases our chosen keywords were part of innocent words, which caused a plague of false positives for follow-up; e.g. the word "magenta" would be caught when we searched for "agent". 

 

>> Using "Formula" to set up some if-then-else statements, and carefully using "and" to construct my conditions, I was able to safe-list some of these innocent words and get rid of a large portion of these false positives.
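A rough Python analogue of that keyword-plus-safe-list screen looks like this. The keywords and safe words below are invented examples; the real compliance lists are obviously different:

```python
import re

# Hypothetical keywords and safe list for illustration only.
KEYWORDS = ["agent", "gift", "facilitat"]
SAFE_WORDS = {"magenta", "reagent"}  # innocent words that contain "agent"

def flag_line(text):
    """Return the keywords found in an invoice line, skipping safe-listed words."""
    hits = []
    for word in re.findall(r"[a-z]+", text.lower()):
        if word in SAFE_WORDS:
            continue
        hits.extend(kw for kw in KEYWORDS if kw in word)
    return hits

hits_risky = flag_line("Payment to local agent for permit")
hits_clean = flag_line("Magenta toner cartridge")
```

Checking the whole word against the safe list before the substring match is what kills the "magenta"-style false positives.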

 

3. Because the outputs of each run are stored separately, my last big challenge was making sure I didn't report/investigate the same transactions month after month as we re-ran the tests.

 

>> Solving this was easy through a collection of file imports, "union", and "join" to compare the current results to the recent past (keeping only new hits) in my analysis.
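The keep-only-new-hits comparison behaves like an anti-join, sketched below in Python. The key fields are assumptions about what uniquely identifies a hit, not the actual schema:

```python
def new_hits(current, previous_runs):
    """current: this month's hits (dicts); previous_runs: lists of past hits.
    Keys on (invoice_id, line_no), assumed unique per transaction."""
    seen = {(h["invoice_id"], h["line_no"])
            for run in previous_runs for h in run}
    return [h for h in current
            if (h["invoice_id"], h["line_no"]) not in seen]

march = [{"invoice_id": "A1", "line_no": 3}, {"invoice_id": "B7", "line_no": 1}]
prior = [[{"invoice_id": "A1", "line_no": 3}]]   # already reported last month
fresh = new_hits(march, prior)
```

In Alteryx terms this is the Join tool's left-unmatched output after joining current results against the union of past results.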

 

Francisco_FCPATEsts.jpeg 

Describe the benefits you have achieved

Even if (after follow-up) the tests have not found any real problems, we are very happy to finally have peace of mind regarding how our employees are behaving across the world. This test was a great way to demonstrate the value of analytics to the more traditional pockets of our company, and its results have been greatly celebrated, giving me and my team some great exposure to the highest levels of my organization. Here are a few quotes from our clients:

 

  • "This is another SUCCESS for the Data Analytics initiative.  There is NO WAY we would have ever even known this was an issue without this capability "
  • "I believe that this proactive approach is clearly one of the most significant advances in early detection techniques that (the team) has implemented in quite a while"
  • "The mere fact that the word will get out that we have tools like this to potentially catch such payments should be a powerful deterrent"
  • "Our analytics practices have changed the way we (work) increasing our effectiveness and efficiency"
  • "I am looking forward  to work on another (analytics) initiative with (the team)"

Author: Keith Snow (@ksnow), President/Data Scientist 

Company: B2E Direct Marketing

 

Awards Category: Best 'Alteryx For Good' Story

On December 1st, 2015, which was "Giving Tuesday", a global day dedicated to giving back, B2E Direct Marketing announced a newly created grant program for 2016 called 'Big Data for Non-Profits'.  B2E Direct Marketing is a business offering Big Data, Visual Business Intelligence and Database Marketing solutions.

 

Non-profit organizations are a crucial part of our society, providing help to the needy, education for a lifetime, social interactions and funds for good causes.

 

Describe the problem you needed to solve 

While serving on three non-profit boards, Keith Snow, President of B2E, became aware that data is among the most important, under-used and least maintained asset of a non-profit. 

 

"The 'Big Data for Non-Profits' Grant program was born out of a vision that we had at B2E to give back to our community. We wanted to offer non-profits the same visual business intelligence and database marketing services that we offer our other clients," says Snow.

 

The grant program includes the following services free of charge to the winning organization in the month for which they are selected:

  • Data Hygiene (clean up donor file)
  • Data Append (age, income, gender, marital status, lifestyle segmentation, and more)
  • Detailed donor analysis and overview reports

 

Each month in 2016, B2E will choose one non-profit from those that apply through www.nonprofit360marketing.com. Award recipient applications are reviewed by a panel selected by B2E, and awards are given based upon how the services will be used to further the organization's goals. The grant program began accepting applications from eligible 501(c)(3) non-profits at the end of December and has already completed work for three organizations this year.

 

"We are excited about using Alteryx to help non-profits expand their mission and to better serve our communities." says Snow.

 

Describe the working solution

B2E has an initial consultation meeting with each non-profit where the goals and takeaways of the 'Big Data for Non-Profits' program are discussed.

 

We identify current data sources that the non-profit has available, and request up to 48 months of donor contact and giving information.  Minimal information is requested from the non-profit as we know great value can be added using Alteryx Designer.

  • Name                                                                         
  • Address, City, State, Zip
  • Phone
  • Date of Donation
  • Amount of Donation
  • Campaign
  • Donation type: e.g. cash, check, soft credit, etc. 


B2E has created Alteryx workflows to perform donor file hygiene. Since we have licensed the data package, we take advantage of the CASS, Zip4 Coder, Experian geodemographic append, and TomTom capabilities.

 

All donor data is address-standardized to meet postal standards, and duplicates within the database are identified. Once the data meets our standards, we process the files against the National Change of Address and National Deceased databases. 

 

The next step is taking the donor's contact information and appending demographics at the individual and household level (age, income, gender, marital status, age of home, Mosaic segmentation, etc.) using the Alteryx Experian add-on product. Alteryx Designer is invaluable for this process as we manipulate the donor data to be more useful for the non-profit.

 

Alteryx's ability to export Tableau Extract files is key to this program's success. We have created key Tableau dashboards that highlight the following:

a. Consumer demographics

b. Mosaic marketing segmentation

c. Campaign or donation source

d. Donation seasonality / giving analysis

e. Pareto (80/20 Rule): to identify and profile the 20% of the donors who contribute 80% of the revenue

f. Geography (city, zip, county, metro area)
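The Pareto (80/20) cut in item (e) can be sketched simply: sort donors by total giving and take the smallest top group that reaches 80% of revenue. The field names and figures below are hypothetical:

```python
def pareto_donors(donations, share=0.8):
    """donations: {donor: total_given}. Returns the top donors whose
    cumulative giving first reaches `share` of the total."""
    total = sum(donations.values())
    running, top = 0.0, []
    for donor, amount in sorted(donations.items(),
                                key=lambda kv: kv[1], reverse=True):
        top.append(donor)
        running += amount
        if running / total >= share:
            break
    return top

# Toy donor file: one donor accounts for 80% of the $1,000 total.
top = pareto_donors({"Ann": 800, "Bo": 120, "Cy": 50, "Di": 30})
```

Profiling just this top group (demographics, Mosaic segment, giving history) is what makes the dashboard actionable for the non-profit.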

 

Once the data is in the Tableau Extract, business intelligence analysis is performed with visualization that is easy to understand and immediately actionable by the non-profit.  Tableau packaged workbooks are created for each non-profit so they have access to interactive analytics to help them make quick and immediate business decisions for their organization.

 

Describe the benefits you have achieved

B2E provides a niche service that many non-profits do not have the knowledge, tools or budget to complete on their own.

 

The benefits to each non-profit include the following:

  1. The donor data from each non-profit can now be processed in days instead of weeks using Alteryx. This maximizes B2E's ability to help more organizations. In the past, we only worked with one non-profit per year. Our 2016 goal is to work with twelve.
  2. A clean donor contact file, with updated addresses, deceased individuals flagged, and duplicates merged, is returned to the organization. Many non-profits send out direct mail; they immediately see their deliverability rates increase by more than 15% and their return mail rates decrease. Printing and postage costs are optimized as well.
  3. The best way to get your current donors to give more is to truly understand what they look like. Understanding a donor's life stage, giving history, demographics, lifestyle characteristics, media preferences, and digital behavior is key to success. Targeting donors in a way that resonates with them has led to an increase in giving. 
  4. All non-profits want access to new donors. A profile identifies what the best donor characteristics look like. Since B2E can also acquire direct mail and email lists, we help the non-profit find "look-alike" individuals who have never donated to their organization.
  5. B2E's goal is to help each non-profit to maximize the current donations coming into their organization so they can keep their expenses and overhead lower as well as offer them a free service they would not have otherwise acquired.

 

The impact to each non-profit is huge, but the impact to B2E is just as great as we are allowed to use a great tool to be a leader in Iowa as a company that truly gives back to our community all year long. As of April, 2016, we have provided services for:

  • Big Brothers Big Sisters of Iowa
  • Children's Cancer Connection
  • Youth Emergency Services and Shelter of Iowa
  • Governors District Alliance
  • Easter Seals Iowa

Author: Brodie Ruttan (@BrodieR), Lead Analytics & Special Projects

Company: Downer New Zealand

 

Awards Category: Name Your Own - Best Use of Alteryx SharePoint Integration

 

Describe the problem you needed to solve

I work for the largest services company in New Zealand, Downer NZ Ltd: water services, telecommunications, power, gas, mining, roads, rail, airports, marine, defense, and more. Our work streams are business-to-business and business-to-government, and as such there are many different, disparate, aged data sources to work with. While we are moving work streams onto new platforms, many of the databases and information systems we use are very dated, and further developing them is cost-prohibitive.

To keep providing our customers with the increased level of service they desire, we need to keep capturing new metrics, but we can't spend the money to further develop aged systems. How can we implement a solution to capture these new metrics without additional cost, and can we use the learning from capturing this data to develop the new information systems that will operate these work streams?

 

Describe the working solution

What we have implemented at Downer is a solution whereby we develop SharePoint lists to sit alongside our current information systems, gather supplementary data about the work we do, and seamlessly report on it. An example would be one of our technicians at a Cell Mast Site (think cell/mobile phone transmitting tower) needing to report that the work cannot be completed, but that the site has been "Made Safe." "Made Safe" is not a Boolean expression available in our current information systems. This is where Alteryx comes in and provides the value. Alteryx is capable of pulling the data out of the aged system and pushing the required job details into SharePoint. Once data has been added to the SharePoint list, Alteryx can then blend the data seamlessly back into exports for reporting and monitoring purposes.
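For a feel of the round trip, here is a minimal, hypothetical sketch of the kind of payload and endpoint involved. The URL shape follows SharePoint's standard REST list-items API; the site URL, list name, field names, and job details are all placeholders, and authentication is omitted entirely:

```python
import json

def made_safe_payload(job):
    """Build the item body Alteryx would push to a supplementary
    SharePoint list for a job that couldn't be completed."""
    return json.dumps({
        "Title": job["job_id"],
        "MadeSafe": job["made_safe"],   # the flag the legacy system lacks
        "SiteName": job["site"],
    })

def list_items_url(site_url, list_title):
    """REST URL for reading the list items back, so they can be
    blended into reporting exports."""
    return f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items"

url = list_items_url("https://downer.example/sites/ops", "JobSupplement")
body = made_safe_payload({"job_id": "CM-1042", "made_safe": True,
                          "site": "Cell Mast 17"})
```

In the actual solution the Alteryx Download tool (or SharePoint connectors) would handle the POST and GET; the point is only that the supplementary fields live in the list, not in the legacy system.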

Brodie_Screenshot_Workflow.png

Describe the benefits you have achieved

Our business now has the capability of expanding legacy systems seamlessly using Alteryx and SharePoint. The cost of implementing the solution is limited only to the licensing costs of Alteryx and a SharePoint environment. Considering both of these licensing costs are sunk, we are capable of expanding systems using only the cost of time, which when using Alteryx and SharePoint is minimal. The cost benefit is immense, to upgrade or expand a legacy information system is a hugely expensive effort with little benefit to show. Legacy information systems in our environment mostly need to be migrated rather than upgraded. While we build these lists to expand our capability and keep our customers satisfied we also get the benefits of lessons learned when developing the new platform. Any information gathered in SharePoint, using Alteryx, needs to be planned for when the new information system is stood up, which saves the effort and cost of additional business analyst work.

 

We have also expanded this capability using Alteryx to pull out multi-faceted work projects for display in Gantt views in SharePoint and then to pull the updated information back into the host systems.

 

Brodie_Screenshot.png

Author: Erik Miller (@erik_miller), Sr. Systems Engineer - Cyber Security Analytics

 

Awards Category: Most Time Saved

 

Describe the problem you needed to solve

My team's story starts from the ground level of analytics: no tools, no resources, no defined data sources. But our Information Security team had an idea: report on all of Western Union's Agent Locations (think Kroger grocery stores, mom & pop shops, etc.) and the risk they posed by not having certain security measures implemented - look at every PC/terminal they have to determine their individual risks (2.4 million terminals when we started), their fraud history, their transaction limits, and so on, and risk-rate every one of those 500,000+ locations. We completed a proof of concept and realized it was completely unsustainable, requiring over 100 hours every month to produce what outwardly looked like a simple report. We rebuilt that process in Alteryx. With just over 2.5 hours of build time in the tool, we took a process which dominated my time and turned it into a 5½-minute run. What's more, we've turned that POC into a full-fledged program and department, focused on risk analytics surrounding employee & contractor resource usage (malicious or uneducated insiders), customer web analytics (looking for hackers), and further Agent analytics.
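The core of a per-location risk rating like this is just an aggregation over terminal-level indicators. Here is a toy Python version; the indicators and weights are invented for illustration and are not Western Union's actual scoring model:

```python
def location_risk(terminals, weights):
    """terminals: one dict of boolean indicators per PC/terminal at a location.
    Returns the location's average weighted risk across its terminals."""
    scores = [sum(w for flag, w in weights.items() if t.get(flag))
              for t in terminals]
    return sum(scores) / len(scores)

# Two terminals at one hypothetical agent location.
risk = location_risk(
    [{"no_av": True, "fraud_history": False, "high_limit": True},
     {"no_av": False, "fraud_history": True, "high_limit": False}],
    weights={"no_av": 3, "fraud_history": 5, "high_limit": 2},
)
```

Run across 2.4 million terminals rolled up to 500,000+ locations, this kind of scoring is exactly the workload that is painful by hand and trivial once automated in a workflow.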

 

Beyond our humble beginnings, there's the constant threat of data breaches, fraud, and malicious insiders in the Information Security world - it's the reality of the work we do. Having the ability to build out a strategic analytics program has been a huge step in the right direction for our industry and company, and it's not an area many other companies have been able to focus on, which sets us ahead of the curve.

 

Describe the working solution

We are using Alteryx to assess several data sources - HR data sets for active/terminated employees & contractors, clickstream data from our digital assets and websites, security data from our Netezza system, fraud data, log files from our various security platforms, user behavior data from our UBA (User Behavior Analytics) system, Identity and Access Management attributes/entitlements, system infection logs, installed applications, etc., etc. As I've said in other talks, we don't have a data lake, we have an ocean.

 

We are currently exporting our data to Tableau .tde files, Hadoop, and MySQL databases. In addition, we have started experimenting with our Alteryx Server implementation (which I support for our company).

 

Describe the benefits you have achieved

Overall time savings is nearing 150 hours a month, so a massive savings and an ability for our team to stay incredibly lean - no additional FTEs needed to keep taking on more and more data and challenges. We've also been able to give visibility to the security implementations for all of our 500,000+ worldwide locations - something which we didn't have visibility to prior to now, and which helps us drive the business to implement security features where needed - based on logic, numbers, and fraud data, not feelings.

 

We also are able to provide insights into our user base - how are our employees using our assets, what are they doing that's lowering our security posture, how are they getting infected. We're providing insights which can help our company become more secure.

 erik_miller_workflow.png

How much time has your organization saved by using Alteryx workflows?

What has this time savings allowed you to do?

With just our first workflow, we saved over 100 hours per month - so over a full FTE of time has been taken off of my plate. Alteryx has allowed us to not only save time each month, but keep our team incredibly lean (we only have three people, and that's all we need to churn through massive amounts of security & fraud data each month).

 

So what has this time saving allowed us to do? Many, many things.

 

First, I was promoted to Sr. Systems Engineer - Cyber Security Analytics. With that change in title, also came the opportunity to build out a strategic-focused Information Security Analytics team, focused on looking at all security data throughout the company and identifying areas where we can improve our security program and posture.

 

Second, it's allowed me time to work with other departments to build out their analytics programs and help them learn to use the Alteryx tools in their respective areas.

 

Third, it's allowed my team to work on new, expanding projects with great ease.