One of the biggest and most impactful changes would be support for detailed unit testing for a canvas - this could work much like it does in Visual Studio:
To fully test a workflow, you need three things:
Ability to replace the inputs with test data
Ability to inspect any exceptions or errors thrown by the canvas
Ability to compare the results to expectation
To do this:
Create a second tab behind each canvas - a Testing view - where you can define tests. Each test contains values for one or more of the inputs, the expected exceptions/errors, and the expected outputs.
Alteryx then needs to run each of these tests one by one - and for each test:
Replace the data inputs with the defined test input.
Check for, and trap, any errors generated by Alteryx
Compare the actual output to the expected output
Generate a test score (pass or fail against each test case)
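The per-test loop above can be sketched in a few lines of Python. This is only an illustration of the idea, not a real Alteryx API - the `Workflow` class, `WorkflowError`, and the test-case field names are all assumed stand-ins:

```python
# Sketch of the per-test loop: replace inputs, trap errors, compare
# output, score pass/fail. Workflow and WorkflowError are stand-ins,
# not a real Alteryx API.

class WorkflowError(Exception):
    """Stand-in for an error raised while running a workflow."""

class Workflow:
    """Toy workflow: sums the 'value' column of input 1, errors on empty input."""
    def run(self, inputs):
        rows = inputs["input 1"]
        if not rows:
            raise WorkflowError("input 1 is empty")
        return [{"total": sum(r["value"] for r in rows)}]

def run_test_case(workflow, case):
    """Run one test case and return 'pass' or 'fail'."""
    try:
        actual, errors = workflow.run(case["inputs"]), []  # replace inputs with test data
    except WorkflowError as exc:                           # trap errors from the run
        actual, errors = None, [str(exc)]
    output_ok = actual == case.get("expected_output")        # compare output
    errors_ok = errors == case.get("expected_errors", [])    # compare trapped errors
    return "pass" if output_ok and errors_ok else "fail"     # generate the score

cases = [
    {"inputs": {"input 1": [{"value": 1}, {"value": 2}]},
     "expected_output": [{"total": 3}]},
    {"inputs": {"input 1": []},
     "expected_output": None,
     "expected_errors": ["input 1 is empty"]},
]
scores = [run_test_case(Workflow(), c) for c in cases]
print(scores)  # ['pass', 'pass']
```

Note the second case passes because the error it raises is the one the test expected - trapping expected errors is itself a test, not a failure.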
This would allow:
Each workflow / canvas to carry its own test cases
Automated regression testing overnight for every tool and canvas
For this canvas, there are two inputs and one output.
Each test case would define:
Test rows to push into input 1
Test rows to push into input 2
Any errors we're expecting
The expected output of the browse tool
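A test case for this two-input canvas could be captured as a small declarative record along these lines - the field names here are illustrative assumptions, not an Alteryx file format:

```python
# Illustrative shape of one test case for a two-input, one-output canvas.
# Field names are assumptions for the sketch, not an Alteryx format.
test_case = {
    "name": "happy path - matching customer IDs",
    "inputs": {
        "input 1": [{"customer_id": 1, "amount": 10.0}],    # rows pushed into input 1
        "input 2": [{"customer_id": 1, "region": "EMEA"}],  # rows pushed into input 2
    },
    "expected_errors": [],          # no errors expected for this case
    "expected_output": [            # what the Browse tool should show
        {"customer_id": 1, "amount": 10.0, "region": "EMEA"},
    ],
}
print(test_case["name"])
```

Keeping each case as plain data like this is what would let every workflow carry its own tests and be re-run automatically overnight.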
This would make Alteryx SUPER robust and allow people to really test every canvas in an incredibly tight way!