The Engine team is responsible for verifying the quality of Alteryx products. The team uses Designer to identify when the performance of the Alteryx engines degrades or improves and to report the details to stakeholders via email. The previous visualization process required manually analyzing each chart and was slower than we wanted. We addressed this by augmenting the process with a set of R and Reporting tools in Alteryx Designer.
Describe the business challenge or problem you needed to solve
The Quality Engineering team has built a suite of workflows for each Alteryx tool to test the performance of E1 and E2, the old and new Alteryx engines. Previously, team leads received daily static PDF charts of tool performance, but the charts lacked detail, forcing the team to hunt for quality issues manually; they were also unintuitive, making it difficult to extract information from them. To address this, we implemented a process for advanced predictive analysis that identifies immediately when engine performance degrades or improves. Catching these quality issues early improves the overall performance of the Alteryx engines.
Describe your working solution
Workflow run duration is a key performance metric: an increase in duration signals a degradation in performance, whereas a decrease signals an improvement. The output log files of each tool’s workflow serve as the data source for testing the engines' performance. We built macros to automate this process.
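The decision rule above can be sketched as a small function. This is an illustrative sketch, not the actual macro logic; the `tolerance` threshold is an assumption introduced here to ignore normal run-to-run noise.

```python
def classify_performance(previous_avg: float, current_avg: float,
                         tolerance: float = 0.05) -> str:
    """Label a change in average workflow duration.

    A longer duration means the engine got slower (degradation);
    a shorter duration means it got faster (improvement).
    The tolerance is a hypothetical noise band (default 5%).
    """
    change = (current_avg - previous_avg) / previous_avg
    if change > tolerance:
        return "degradation"
    if change < -tolerance:
        return "improvement"
    return "stable"

print(classify_performance(10.0, 12.0))  # prints "degradation" (20% slower)
```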
This is the main workflow to test the performance of each tool in different environments.
These log files are stored on the Jenkins server and are parsed into an Alteryx database file (.yxdb).
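In Designer this parsing is done with workflow tools, but the idea can be sketched in a few lines of Python. The log line format below (`tool=… engine=… duration=…`) is purely hypothetical; the real Jenkins log layout differs.

```python
import re

# Hypothetical log line format, e.g. "tool=Join engine=E1 duration=12.4".
LINE_RE = re.compile(
    r"tool=(?P<tool>\w+)\s+engine=(?P<engine>E[12])\s+duration=(?P<duration>[\d.]+)"
)

def parse_log(text: str) -> list:
    """Extract one record per run: tool name, engine version, duration in seconds."""
    records = []
    for match in LINE_RE.finditer(text):
        rec = match.groupdict()
        rec["duration"] = float(rec["duration"])
        records.append(rec)
    return records

sample = "tool=Join engine=E1 duration=12.4\ntool=Join engine=E2 duration=8.9"
print(parse_log(sample))
```

In the real process the parsed records land in a .yxdb file rather than a Python list.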
Predictive & Statistical Analysis: This macro calculates the seven-day moving average and performs a linear regression analysis to determine whether there is a linear relationship between duration and date.
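The two calculations the macro performs can be sketched as follows. This is a simplified stand-in for the Designer macro: a trailing seven-day moving average plus an ordinary least-squares slope of duration against day index, where a positive slope suggests degradation and a negative slope suggests improvement. The sample durations are made up for illustration.

```python
from statistics import mean

def moving_average(durations, window=7):
    """Trailing moving average of run durations (seven-day by default)."""
    return [mean(durations[max(0, i - window + 1): i + 1])
            for i in range(len(durations))]

def linear_trend(durations):
    """Ordinary least-squares slope of duration vs. day index.

    Positive slope: durations are growing (degradation).
    Negative slope: durations are shrinking (improvement).
    """
    n = len(durations)
    xs = range(n)
    x_bar = mean(xs)
    y_bar = mean(durations)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, durations))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical daily durations (seconds) for one tool's workflow.
durations = [10.1, 10.0, 10.3, 10.4, 10.6, 10.8, 11.0, 11.2]
print(moving_average(durations)[-1])   # average over the last 7 days
print(linear_trend(durations))         # positive: durations trending up
```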
Every day, team leads and the Quality Engineering team receive an email report showing which tools' performance improved or degraded.
Describe the benefits you have achieved
This process used to be manual and take hours. Since automating it, we have seen an improvement in engine quality: we know exactly what is happening and when, and we can address issues easily. Automation has also helped us learn from the tools that perform well and apply that knowledge to other tools.
Alteryx makes me feel like the superhero Flash, and I’m very excited about the future of predictive analytics and artificial intelligence!