What best practices do you use during your peer review process? Do you use a checklist? Our admin is about to change jobs, and since the rest of us are just starting our Alteryx journey (we are participating in the Alteryx challenges as a team), it would be great to know how others handle peer review, what efficiency tips are recommended, and generally what we should be looking for so we can implement this on our team.
I'm going to start by saying that the answers you get to this post will vary based upon the controls needed in your organization. I would suggest a checklist to ensure that the review process addresses all control requirements. I'm answering this post based upon my personal as well as my professional experience, so take it all as a friendly suggestion and not as a "best practice".
Don't do a single peer review process
As you're developing the workflow, I'm a fan of pair programming and would suggest that multiple eyes look at the workflow during the creation process. Maybe you've run a review with the end user/client, and now that the results are 'right', you're ready to finalize the workflow and complete your preparation for production. You've been documenting the workflow since day one and preparing for production. If the purpose of the review is to ensure that the workflow is ready to move to production, you likely have tasks to verify. Here are examples:
Browse tools are turned off
Workflow dependencies use UNC paths
Input/Output tools point to production
Any "testing" elements within the workflow are disabled
That's what I think of for a single peer review process.
Do do iterative reviews with objectives in mind
Are you getting the right results? Is your process efficient? Is your process easy to maintain? Is your process durable over time? Is your process auditable/testable? Going back to square one would be defeating if you thought you were done and someone made you redesign your workflow to adhere to standards.

Any two "artisans" will approach a workflow challenge differently. Being able to defend your approach is a strength, but so is being able to accept and adopt new techniques. Sometimes people are intimidated by me and offer lots of apologies before showing me their work; I learn a lot from seeing how they approach their challenge. I also tailor the amount of feedback I give based upon their abilities. Some of my feedback includes questions about the future and what improvements they might consider, rather than things that must be addressed immediately.
I saw a "monthly" process that required the user to modify the workflow each month by copying containers onto the workflow and configuring the results to manually adjust the outputs. There was plenty of repetition in the workflow and though it was neat and organized, we discussed the need to configure the workflow to make the needed adjustments in a more dynamic fashion. After agreement on that, she told me that she was also challenged that the incoming field names couldn't be trusted to be in a single format. This brought us to testing. We needed to either test the incoming data or to make needed adjustments to naming standards (e.g. sep vs SEPT) within the workflow. I didn't see a single test tool within the workflow. Though this is common, it is always a flag for me. If something is running with the lights out, how do you know that the data is accurate. As @Ken_Black says, "All data is guilty until proven innocent".
My final thought: if a peer review process never results in changes, the process likely isn't worth having. That's my opinion.
Alteryx ACE & Top Community Contributor
Chaos reigns within. Repent, reflect and reboot. Order shall return.