I have a table with several measurement fields (columns) for each row (entity). Each measurement has a lower bound and an upper bound that I have to check the values in the main table against, and I'm not sure of the best way to do this.
These boundary values are stored in a separate table:
I've come up with one solution but I'm not particularly happy with it. It seems to be inefficient. I'm fairly new to Alteryx so I'm sure there's a better way -- there's a lot I don't know!
In my "solution" I took the boundary values table and applied two Crosstab tools to separate the upper and lower boundary values, flattening each into a 1 x 3 table. I Appended these together, then Appended the resulting 1 x 6 table to my original data table, so that I could then use a Formula tool, apply Filters, etc.
But it bothers me that I'm duplicating the set of boundary values for each row in the main table. In my real-world file I have 12,000+ rows with 83 measures to validate. That's a lot of redundancy!
I'd be delighted to hear of a better way to accomplish this task.
A simple solution for this is to Transpose your entity table, grouping by Entity. That gives you a generic Name field containing measure 1, 2, ..., n. Then Join to the boundaries table on Name (main) = Measure (support). From there you get the lower and upper bound for each measure, but as only two columns instead of hundreds.
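The same transpose-then-join logic can be sketched outside Alteryx in plain Python. This is just an illustration of the data flow, not anything generated by the tools; all field names ("Entity", "Name", "Value", and the sample measures) are made up for the example.

```python
# Sample "main" table: one row per entity, one column per measure.
# These names and values are illustrative only.
entities = [
    {"Entity": "A", "Weight": 10.2, "Length": 5.0, "Width": 2.1},
    {"Entity": "B", "Weight": 12.9, "Length": 4.2, "Width": 3.5},
]

# Boundary table: one (lower, upper) pair per measure -- stored once,
# not duplicated across the 12,000+ entity rows.
bounds = {
    "Weight": (10.0, 12.0),
    "Length": (4.0, 6.0),
    "Width":  (2.0, 3.0),
}

# Step 1: "Transpose" each entity row into (Entity, Name, Value) records,
# grouping by Entity -- the wide table becomes a long one.
long_rows = [
    {"Entity": row["Entity"], "Name": name, "Value": value}
    for row in entities
    for name, value in row.items()
    if name != "Entity"
]

# Step 2: "Join" on Name and flag values outside [lower, upper].
violations = [
    r for r in long_rows
    if not (bounds[r["Name"]][0] <= r["Value"] <= bounds[r["Name"]][1])
]

for v in violations:
    print(v["Entity"], v["Name"], v["Value"])  # prints B Weight 12.9 and B Width 3.5
```

The key point is that the bounds live in one small lookup keyed by measure name, so the long format lets every measure be validated with a single generic rule instead of 83 separate formulas.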