@AdamR_AYX did a talk this year at Inspire EU about testing Alteryx canvases, and it seems there is a lot we can do here to improve the product:
https://www.youtube.com/watch?v=7eN7_XQByPQ&t=1706s
One of the biggest and most impactful changes would be support for detailed unit testing of a canvas, much like unit testing works in Visual Studio:
Proposal:
To fully test a workflow, you need three things:
- Ability to replace the inputs with test data
- Ability to inspect any exceptions or errors thrown by the canvas
- Ability to compare the results against expectations
To do this:
- Create a second tab behind a canvas: a Testing view of the canvas which allows you to define tests. Each test contains values for one or more of the inputs, any expected exceptions or errors, and the expected outputs
- Alteryx then needs to run each of these tests one by one, and for each test:
- Replace the data inputs with the defined test input.
- Check for, and trap, any errors generated by Alteryx
- Compare the actual output to the expected output
- Generate a test score (pass or fail against each test case)
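To make the per-test loop concrete, here is a minimal sketch of that logic in Python. Everything in it is an assumption for illustration: `TestCase`, `run_canvas`, and `run_test` are hypothetical names, and `run_canvas` is a stand-in for whatever mechanism Alteryx would use to execute the workflow with the inputs swapped for test data.

```python
from dataclasses import dataclass

# Hypothetical model of one canvas test case; the field names are
# illustrative, not part of any existing Alteryx API.
@dataclass
class TestCase:
    name: str
    inputs: dict             # input anchor name -> list of test rows
    expected_errors: list    # error messages we expect the canvas to raise
    expected_output: list    # rows we expect at the output anchor

def run_canvas(inputs):
    """Stand-in for executing the workflow with its data inputs replaced
    by test data. Here we just merge the rows of the two inputs pairwise
    so there is something to compare against."""
    rows = [{**a, **b} for a, b in zip(inputs["in1"], inputs["in2"])]
    return rows, []          # (output rows, errors trapped during the run)

def run_test(case: TestCase) -> bool:
    try:
        actual, errors = run_canvas(case.inputs)
    except Exception as exc:
        actual, errors = [], [str(exc)]
    # A test passes only when both the trapped errors and the output
    # rows match what the test case declared.
    return errors == case.expected_errors and actual == case.expected_output

case = TestCase(
    name="join keeps matching rows",
    inputs={"in1": [{"id": 1}], "in2": [{"val": "a"}]},
    expected_errors=[],
    expected_output=[{"id": 1, "val": "a"}],
)
print("PASS" if run_test(case) else "FAIL")  # prints "PASS"
```

The key design point is that error trapping and output comparison are both part of the pass/fail decision, so a canvas that is expected to error can still have a passing test.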
This would allow:
- Each workflow / canvas to carry its own test cases
- Automated overnight regression testing for every tool and canvas
Example:
For this canvas, there are two inputs and one output.
Each test case would define:
- Test rows to push into input 1
- Test rows to push into input 2
- Any errors we're expecting
- The expected output of the browse tool
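A test case like the one above could be captured declaratively. The sketch below is one possible shape for such a definition; the field names, input labels, and sample rows are all assumptions, not an existing Alteryx format.

```python
# Hypothetical declarative definition of the two-input test case
# described above; every name and value here is illustrative.
test_case = {
    "name": "two-input join",
    "inputs": {
        "Input 1": [{"CustomerID": 1, "Name": "Ada"}],
        "Input 2": [{"CustomerID": 1, "Region": "EU"}],
    },
    "expected_errors": [],
    "expected_output_browse": [
        {"CustomerID": 1, "Name": "Ada", "Region": "EU"},
    ],
}
print(sorted(test_case))
```

Because the definition is plain data, the same file could drive both an interactive Testing view and an overnight batch runner.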
This would make Alteryx SUPER robust and allow people to test every canvas in an incredibly tight way!