Have you ever built a workflow only to find out later that you’re losing a bunch of your records after trying to Join a couple files together? Do you ever wish you could check the number of records as they pass through your workflow, and be alerted when records get dropped? Or even stop the workflow when there’s a value that is too high or too low? With Alteryx, you can set up these checkpoints and alerts, and focus on your project instead of spending time trying to find where your workflow is leaking records. You can use a Count Records tool wherever you need to check the number of records, or use a Test tool when you want to check whether the number of records matches a value or meets some condition.
What we will use:
An example workflow is attached to this post (shown below); simply open it in Alteryx and run it.
In the screenshot above, I’m using both the Test tool and the Count Records tool for three use cases:
When you join tables together, you don’t expect to lose cases that you thought were going to match. Usually, you expect every case in one file to match a case in the other. But what happens when some IDs in one file are entered incorrectly? You risk being unable to match those cases, and you’ll lose them without even knowing it, which could pose quite a problem! This can happen when you use the Join tool, and it’s something we want to avoid. So, I’ve found it helpful in these cases to set up an alert with the Test tool that tells me if the number of cases that fail to join is greater than zero; if any cases flow through the Left or Right output anchor of the Join tool, that will trigger the alert.
The example I have comes from the HR space. Workers at a company responded to an employee satisfaction survey, and the results are stored in two files: one stores the answers to “Strongly Agree – Strongly Disagree” type questions; the other contains their comments on an open-ended question. We need to match the comments to the right employees so we can do some more analyses on the combined data. And to do that, we will use a Join tool to match records by Employee ID.
Here’s how to do it:
Notice here that there are 332 records in the Employee Satisfaction data, but we have only 309 records of employee comments. The Join matched 309 comments with the employee survey responses. But we also lost 23 survey records from the Left output anchor; not all employees provided a comment. The Test tools we just set up will alert us if the record count coming through either output anchor is not 0. And because there were 23 records coming through the Left output anchor of the Join, the Test throws an alert[i].
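If it helps to see the logic outside of Alteryx, here is a minimal Python sketch of what the Join plus Test-tool check amounts to. The field names, sample sizes, and variable names are illustrative, not taken from the attached workflow:

```python
# Hypothetical sketch: join two record sets on EmployeeID and flag
# an "alert" when either side has records that failed to match.
surveys = [{"EmployeeID": i, "Satisfaction": "Agree"} for i in range(1, 333)]   # 332 records
comments = [{"EmployeeID": i, "Comment": "..."} for i in range(1, 310)]         # 309 records

survey_ids = {r["EmployeeID"] for r in surveys}
comment_ids = {r["EmployeeID"] for r in comments}

joined = [r for r in surveys if r["EmployeeID"] in comment_ids]        # J anchor
left_only = [r for r in surveys if r["EmployeeID"] not in comment_ids]  # L anchor
right_only = [r for r in comments if r["EmployeeID"] not in survey_ids] # R anchor

# The Test tools assert these unmatched counts are zero; here 23 survey
# records have no matching comment, so the check fails and we would alert.
print(len(joined), len(left_only), len(right_only))  # → 309 23 0
```

The same idea applies regardless of how the join is implemented: count what falls out of the Left and Right anchors, and treat anything other than zero as a signal worth investigating.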
(Note: In the attached workflow, you’ll notice I’ve included an identical workflow in a separate container, but one where no records are lost in the Join. There are 332 employee records and 332 employee comments. All records match, so no alerts are thrown. To view this, open the container under the workflow and use the shortcut Ctrl+R to re-run the whole thing; you’ll see there are no lost records, and so no alert gets thrown.)
In this second use case, I want to be alerted when 5% or more of my employees are working either too many or too few hours compared with the average number of hours worked for their job type. If too many employees are working too many hours, that would suggest I need more employees; if too many employees are working too few hours, that would suggest an opportunity to train them to increase their skill set and cover more tasks.
So, imagine that you need to be alerted when a condition or comparison, like the one in this use case, fails to be met. Again, you can use the Test tool to do this; in addition to testing for a single value as we just did, the Test tool can also determine whether more complex conditions are met.
For this example, we want to do two things:
The following part gets a bit technical; Alteryx users who are familiar with data prep might want to skip to the next section about using the Test tool to test for a condition.
First, we need to calculate the values for over- and underutilization, by job type. I’ll define an overworked employee as someone whose hours are more than two standard deviations above the average for their job type, and an underworked employee as someone whose hours are more than two standard deviations below that average:
Now we can test whether 5% or more of employees are working too many or too few hours. So, let’s attach two Test tools to the Formula tool and configure one to add a test called OverUtilization. Choose the test type “Expression is True for First Record” and define the Test value as [OverUtilized] <= [Threshold]. Let’s do the same for the other Test tool, but here define the test value as [UnderUtilized] <= [Threshold].
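For readers who prefer to see the calculation spelled out, here is a rough Python equivalent of the per-job-type threshold and the 5% test. The hours, job types, and field names are made up for illustration; only the logic mirrors the workflow:

```python
from statistics import mean, pstdev
from collections import defaultdict

# Illustrative data: hours worked per employee, by job type
employees = [{"JobType": "Analyst", "Hours": h} for h in [38, 40, 41, 39, 40, 42]] \
          + [{"JobType": "Manager", "Hours": h} for h in [45, 44, 46, 45, 43]]

# Per-job-type average and standard deviation of hours worked
by_job = defaultdict(list)
for e in employees:
    by_job[e["JobType"]].append(e["Hours"])
stats = {job: (mean(hrs), pstdev(hrs)) for job, hrs in by_job.items()}

# Flag anyone more than two standard deviations from their job-type average
over = [e for e in employees
        if e["Hours"] > stats[e["JobType"]][0] + 2 * stats[e["JobType"]][1]]
under = [e for e in employees
         if e["Hours"] < stats[e["JobType"]][0] - 2 * stats[e["JobType"]][1]]

threshold = 0.05  # alert when 5% or more of employees are flagged
over_share = len(over) / len(employees)
under_share = len(under) / len(employees)

# Mirrors the Test values [OverUtilized] <= [Threshold] and [UnderUtilized] <= [Threshold]
print(over_share <= threshold, under_share <= threshold)  # → True True (no alert would fire)
```

As in the workflow, the test passes (no alert) as long as the flagged share stays under the 5% threshold.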
Neither test throws an alert at this point, so we don’t have to worry about an impact on headcount, which is a good thing! So, now we can add an Output Data tool to save the file and use it later for more analyses.
Just as in this second use case, you can use the Test tool to check for any number of conditions, whenever you need to make sure that the records passing through your workflow don’t exceed a defined limit, or do meet a defined comparison.
For this last use case, we need to be sure that no records were unexpectedly dropped at different points along the line, and that our record counts are what we expect them to be. So, I’ll place several Count Records tools at different points in our workflow and use a Test tool to check whether the final record count equals the number of records counted at an earlier point in the workflow (shown below).
Notice in the sample workflow that I use wireless connections[ii] for each of the Count Records tools in the workflow. The first Count Records tool (at the top) is connected to the Input Data tool for the Employee Data, which is at the very beginning of the workflow.
The Count Records tool is so easy to use, you don’t even need to configure it. But be careful, because it outputs only one cell of data – the record count – and names that column Count. In my example I have six Count Records tools, so I’ll attach a Select tool after each one to rename Count to something descriptive, like UploadedEmployeeRecords for the record count when I first uploaded the employees’ survey data.
Now let’s use the Join Multiple tool to join by record position and combine all record counts into one row of data.
Your results will look like the screen shot below:
So, you can see that we uploaded 309 comments and 332 survey responses, and then joined the two files by matching on Employee ID. Three hundred and nine records joined successfully, but we also lost 23 records from the survey data file. This makes sense: remember that in the top workflow example, there were 23 employees who did not offer a comment[iii].
Now, if we want to check the final record count and compare it to the count earlier in the workflow, we can add a Test tool to the Join Multiple tool, choose the test type “Expression is True for First Record,” and use [SuccessfullyJoined] = [FinalOutputRecordCount] as the Test value. If the record counts at these two points differ, the test will throw an alert. You can do this sort of value checking throughout your own workflows, to ensure that the record counts at the start, middle, and end are what you expect them to be, and to alert you if they are not.
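The checkpoint idea can be sketched in a few lines of Python: take counts at several points, combine them into one row, then compare. The column names below mirror the renamed Count fields from the Select tools; the numbers come from this example, but the structure is hypothetical:

```python
# One combined "row" of record counts, as the Join Multiple tool would produce
counts = {
    "UploadedCommentRecords": 309,
    "UploadedEmployeeRecords": 332,
    "SuccessfullyJoined": 309,
    "LostFromSurveyData": 23,
    "FinalOutputRecordCount": 309,
}

# Mirrors the Test value [SuccessfullyJoined] = [FinalOutputRecordCount]
if counts["SuccessfullyJoined"] != counts["FinalOutputRecordCount"]:
    raise ValueError("Record count mismatch: the workflow is leaking records")

# A further sanity check: joined + lost should account for every survey record
assert counts["SuccessfullyJoined"] + counts["LostFromSurveyData"] == counts["UploadedEmployeeRecords"]
print("All record-count checks passed")
```

The value of this pattern is that the comparison runs on every execution, so a silently dropped record surfaces immediately instead of weeks later.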
________________________________________
[i] If you want to stop your workflow from running completely when an alert is thrown, check the box for “Cancel Running Workflow on Error” in the Runtime settings of your workflow configuration.
[ii] More about wireless connections here: https://community.alteryx.com/t5/Alteryx-Designer-Ideas/Wireless-Connections/idi-p/1367
[iii] If you needed those 23 records anyway, you could use a Union tool to merge the 23 cases back into the data, and their values for the Comment column would be null.