
Alteryx Designer Desktop Ideas


Strong support for testing

@AdamR_AYX gave a talk this year at Inspire EU about testing Alteryx canvases - and it seems there is a lot we can do here to improve the product:

https://www.youtube.com/watch?v=7eN7_XQByPQ&t=1706s

 

One of the biggest and most impactful changes would be support for detailed unit testing for a canvas - this could work much like it does in Visual Studio:

 

Proposal:

In order to fully test a workflow, you need three things:

  • Ability to replace the inputs with test data
  • Ability to inspect any exceptions or errors thrown by the canvas
  • Ability to compare the results to expectation

To do this:

  • Create a second tab behind a canvas - a Testing view of the canvas - which allows you to define tests. Each test contains values for one or more of the inputs; expected exceptions / errors; and expected outputs
  • Alteryx then needs to run each of these tests one by one (a rough sketch of what such a runner could look like follows below) - and for each test:
    • Replace the data inputs with the defined test input
    • Check for, and trap, any errors generated by Alteryx
    • Compare the actual output to the expected output
    • Generate a test score (pass or fail against each test case)

This would allow:

  • Each workflow / canvas to carry its own test cases
  • Automated regression testing overnight for every tool and canvas
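 

Nothing like this exists natively today, so purely as an illustration of the mechanics proposed above, here is what an external harness might look like. This is a sketch, not an Alteryx API: the AlteryxEngineCmd.exe path, the fixed CSV input / output locations, and the WorkflowTest structure are all assumptions made for the example.

```python
# Hypothetical external test harness - a sketch of the behaviour proposed above.
# Assumes the canvas reads fixed CSV inputs and writes its result to a CSV output.
import shutil
import subprocess
from dataclasses import dataclass
from pathlib import Path

import pandas as pd

# Assumed install path for the command-line engine; adjust for your environment.
ENGINE_CMD = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"


@dataclass
class WorkflowTest:
    name: str
    workflow: Path                 # the .yxmd canvas under test
    input_files: dict[Path, Path]  # {path the canvas reads: test fixture to substitute}
    output_file: Path              # where the canvas writes its result
    expected_output: Path          # expected rows for comparison
    expect_error: bool = False     # do we expect the run to fail?


def run_test(test: WorkflowTest) -> bool:
    # 1. Replace the data inputs with the defined test input
    for target, fixture in test.input_files.items():
        shutil.copyfile(fixture, target)

    # 2. Run the canvas and trap any errors it raises
    result = subprocess.run([ENGINE_CMD, str(test.workflow)],
                            capture_output=True, text=True)
    errored = result.returncode != 0
    if errored != test.expect_error:
        print(f"FAIL {test.name}: error state {errored}, expected {test.expect_error}")
        return False
    if errored:
        print(f"PASS {test.name}: expected error occurred")
        return True

    # 3. Compare the actual output to the expected output
    actual = pd.read_csv(test.output_file)
    expected = pd.read_csv(test.expected_output)
    if actual.equals(expected):
        print(f"PASS {test.name}")
        return True
    print(f"FAIL {test.name}: output differs from expectation")
    return False
```

Built into Designer and Server, the same loop run against every stored test case is what would give you the pass / fail score per canvas and the overnight regression run.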

 

 

Example:

 

[Image: Testing.jpg, an example canvas with two inputs and one output feeding a Browse tool]

 

For this canvas, there are two inputs and one output.

Each test case would define:

  • Test rows to push into input 1
  • Test rows to push into input 2
  • Any errors we're expecting
  • The expected output of the browse tool
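
Building on the hypothetical WorkflowTest sketch above, a single test case for this example canvas might be defined like this (all file names are invented for illustration, and WorkflowTest / run_test come from the earlier sketch):

```python
# One illustrative test case for the two-input / one-output example canvas above.
# File names are placeholders for whatever the canvas actually reads and writes.
from pathlib import Path

happy_path = WorkflowTest(
    name="example canvas - happy path",
    workflow=Path("ExampleCanvas.yxmd"),
    input_files={
        Path("inputs/input1.csv"): Path("tests/fixtures/input1_small.csv"),  # rows for input 1
        Path("inputs/input2.csv"): Path("tests/fixtures/input2_small.csv"),  # rows for input 2
    },
    output_file=Path("outputs/browse_result.csv"),
    expected_output=Path("tests/expected/browse_result.csv"),
    expect_error=False,  # no errors expected for this case
)

print("PASS" if run_test(happy_path) else "FAIL")
```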

 

 

This would make Alteryx SUPER robust and allow people to really test every canvas in an incredibly tight way!

7 Comments
SeanAdams
17 - Castor

Copying @AdamR_AYX @SteveA @AshleyK @TomSt @dataMack @TuvyL @Treyson

 

This was the topic we discussed about embedding full regression testing into the Alteryx Engine (both Server and Designer), which would enable robust and repeatable testing on every release, canvas, and tool.

 

 

dataMack
12 - Quasar
This is a great idea that would make upgrading and migrating workflows much easier.
TerryT
Alteryx Alumni (Retired)

I totally agree! In a previous life, TDD (Test-Driven Development) was required. It took a while to appreciate the technique, but I can attest to the benefits of:

 

  1. Improved Software Reliability and Quality
  2. Improved Job Satisfaction as a Developer
  3. Improved Interfaces and Modularity (and understanding the module from the user perspective)
  4. Improved Confidence to Make Improvements ("if there are no tests, you're hacking, not refactoring")
  5. Ability to Automate Testing

 

Thanks for the great suggestion and the desire to make it a better product / experience for all!

papalow
8 - Asteroid

I like the idea.  I like the best practices included in the video. 

 

Alteryx is powerful and fast. I find it is easy to get seduced by workflows that run without error. Testing provides a check and balance on that computing power and speed.

phillipehanson
5 - Atom
I'm still a relatively novice Alteryx user, but I've been part of many TDD teams over the years, and I've done my share of business-side 'quick win' automation and database solutions. When I first started using Alteryx, I was like a kid in a candy shop. Business-side development, reporting, ETL, and automation have always been fairly limited by the lack of available tools at our disposal, and rightfully so. Alteryx is changing that landscape to a degree, putting extremely powerful tools in the hands of end users. As a result, I'm always amazed when I see these workflows of incredible complexity being delivered and automated. Rapid delivery and faster decisioning.

As time goes by, we end up with Galleries of massive and impressive scale, performing detailed and meticulous calculations and transformations across a sea of embedded steps. Now, the obvious question is: besides literal job or workflow failure, what controls and visibility do we have? How do we know that everything is working as intended, that the numbers aren't skewed, that partial inputs aren't being processed, and so on? For outfits with a high degree of Alteryx maturity, I'm sure that validation and controls 'certify' the results to some degree. But for the average kid in the candy store, the complexity and demands of maintaining an accurate and valid sea of workflows seem to grow proportionally riskier over time and as things change within the business.

I'd like to think that my novice Alteryx experience is leading me down a path of assumptions at this point. I hope so. But my experience elsewhere makes me nervous. Regression testing is a bear to set up, implement, and maintain. But with it, at least one can sleep well. Without it, or at least some form of it, I wonder how anyone can answer the basic question, "This is all great stuff, but how do you know it's working correctly and accurately?" 🙂
AlteryxCommunityTeam
Alteryx Community Team
Status changed to: Accepting Votes
 
TheOC
15 - Aurora

It would be great to see movement towards testing in any form with Alteryx Designer. 

@phillipehanson articulates fantastic points about the need for testing and reliance on test-driven development within the business. 

 

Something similar can now be set up with Server and v3 orchestration workflows, but I don't feel that comfortably fills this gap; more first-class support for testing is needed.

Great suggestion!