I'm looking to calculate the operational % of an item.
This item travels to different locations and has many statuses it can be in.
Information comes in sporadically, so we want to assume its last reported status, calculate the time between the dates as shown below, and then calculate what % of time it spends in each state (% operational being specifically the time it has been UP and in use).
We also want to be able to see whether an item spends more time DOWN at a specific location, to look for trends by location.
I can calculate this in Tableau, BUT it is too taxing for the viz given there are 2 million rows and counting daily (an increase of ~60k rows a week).
It is important that we see the part's lineage (location) and operational time, but I'm fairly new to Alteryx and don't know the route to take.
| Date | 1-Jan | 2-Jan | 3-Jan | 4-Jan | 5-Jan | 6-Jan | 7-Jan | 8-Jan | % Operational |
|------|-------|-------|-------|-------|-------|-------|-------|-------|---------------|
| ID 1 | UP | no report | no report | DN | no report | no report | UP | no report | |
| Assumed info | UP | UP | UP | DN | DN | DN | UP | UP | 66% ish (don't judge my eyeball math) |
| ID 2 | DN | DN | no report | no report | no report | no report | DN | no report | |
| | A | A | A | B | B | B | B | D | |
| Assumed info | DN | DN | DN | DN | DN | DN | DN | UP | 3% ish |
| | A | D | B | C | C | C | C | C | |
ID 1 health by location example:

| Location | % Operational |
|----------|---------------|
| A | 70% |
| B | 75% |
| D | 100% |
Hi @pitmansm
I feel like the structure of the example data has hindered responses from the Community. Let's see if I understand this correctly:
- The "Assumed info" is the assumed state of the "ID #" values above, yes? Would it make sense to have that directly adjacent to those ID values, or even replace them?
- The A, B, C, D values are locations at which these status values occur?
The first thing I would recommend is to transform this data a bit to make the calculations very easy.
- Let's talk about the sequence of rows. Is it possible for the rows to always be grouped together, perhaps in the sequence Status, Assumed Status, Location? The order of rows in your example data leads me to assume that the first Location row relates to "ID 1", but I don't know this for sure. It is possible to correct for this in Alteryx, but it can be tricky and (more importantly) circumstantially unstable.
We can talk more about data transformation if you'd like, but the data schema I'd recommend getting to is one row per ID per day, along the lines of: ID | Date | Location | Assumed Status.
Once the data looks like this, the calculations you've requested are very simple and take just a couple of tools. First, assign a 1/0 to the assumed status, and then use a Summarize tool to group by a field and average the new 1/0 field.
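If it helps to see that logic outside of Alteryx, here's a rough pandas sketch of the same idea (the column names `AssumedStatus` and `Location` are just assumptions about your schema):

```python
import pandas as pd

# Toy long-format data: one row per part per day, with the
# "no report" gaps already filled in with the assumed status.
df = pd.DataFrame({
    "Part": [1, 1, 1, 1],
    "Location": ["A", "A", "B", "B"],
    "AssumedStatus": ["UP", "DN", "UP", "UP"],
})

# Map the status to 1/0 so a plain average gives % operational directly.
df["IsUp"] = (df["AssumedStatus"] == "UP").astype(int)

# Group by part (or part + location) and average the 1/0 flag.
pct_by_part = df.groupby("Part")["IsUp"].mean()
pct_by_location = df.groupby(["Part", "Location"])["IsUp"].mean()
print(pct_by_location)
```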
Check out the attached workflow to see an example in action.
Hi @pitmansm
I took a stab at getting your input into long format, assuming an input that looks like this:

This workflow pivots your data and then applies @CharlieS' very clever data mapping, which avoids having to calculate totals and then divide. It gives an output close to what you're looking for.
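In code terms, the "assume the last status" step boils down to a forward fill over a daily calendar. A minimal pandas sketch of that idea (the dates and column names here are assumed, not your real data):

```python
import pandas as pd

# Sparse reports: one row per part only on days a status actually came in.
reports = pd.DataFrame({
    "Part": [1, 1, 1],
    "Date": pd.to_datetime(["2023-01-01", "2023-01-04", "2023-01-07"]),
    "Status": ["UP", "DN", "UP"],
})

# Build one row per part per calendar day, forward-filling the last
# known status into the "no report" days.
daily = (
    reports.set_index("Date")
    .groupby("Part")["Status"]
    .apply(lambda s: s.resample("D").ffill())
    .reset_index()
)
print(daily)
```

Note the fill only runs to each part's last report; extending every part out to a common end date is a small extra step.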
Dan
Oh yeah, the data is in row format, and it shows like you described for sure. The Assumed portion is there to show the desired logic; the data as it flows does not show the assumed data. I'll gen up an example of exactly how it looks. @CharlieS
| Part | Status | Date |
|------|--------|--------|
| 1 | UP | 1-Jan |
| 2 | UP | 1-Jan |
| 3 | DN | 1-Jan |
| 4 | DN | 10-Jan |
| 3 | DN | 10-Jan |
| 2 | UP | 10-Jan |
| 6 | UP | 13-Jan |
| 1 | DN | 13-Jan |
| 2 | DN | 13-Jan |
My apologies, as the intent was just to show the concept. My data is spread across 1,600 CSVs. I'm good with combining them all; it's calculating the time between statuses and assuming the last status that has me stumped.
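To be concrete, the logic I'm after is roughly this (a pandas-style sketch with made-up dates and a made-up cutoff, just to illustrate):

```python
import pandas as pd

# Sparse reports, like my sample above.
reports = pd.DataFrame({
    "Part": [1, 1, 2, 2],
    "Status": ["UP", "DN", "UP", "DN"],
    "Date": pd.to_datetime(
        ["2023-01-01", "2023-01-13", "2023-01-01", "2023-01-10"]
    ),
}).sort_values(["Part", "Date"])

# Each status is assumed to hold until the next report for that part.
reports["NextDate"] = reports.groupby("Part")["Date"].shift(-1)

# The last report has no successor; assume it holds through some cutoff
# (e.g. "today"). The cutoff here is arbitrary for the example.
cutoff = pd.Timestamp("2023-01-15")
reports["Days"] = (reports["NextDate"].fillna(cutoff) - reports["Date"]).dt.days

# % operational = UP-days / total days, per part.
totals = reports.groupby("Part")["Days"].sum()
up_days = reports[reports["Status"] == "UP"].groupby("Part")["Days"].sum()
pct_operational = (up_days / totals * 100).fillna(0)
print(pct_operational)
```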
Oops, yeah, my apologies: date and location are derived from the file name.
I have a good grasp on how to aggregate these CSVs and pull the date and location out of the file names. The operational % is the kicker for me.
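For completeness, that file-name step is something like this (assuming, purely for illustration, a pattern like `<location>_<date>.csv`):

```python
import pandas as pd
from pathlib import Path

# Assumed file-name pattern: "<location>_<YYYY-MM-DD>.csv",
# e.g. "A_2023-01-01.csv" in a "data" folder.
frames = []
for path in Path("data").glob("*.csv"):
    location, date_str = path.stem.split("_", 1)
    frame = pd.read_csv(path)  # columns assumed: Part, Status
    frame["Location"] = location
    frame["Date"] = pd.to_datetime(date_str)
    frames.append(frame)

all_reports = pd.concat(frames, ignore_index=True)
```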
It's a difficult data set to cleanse since I have to completely make up the data to give an example, lol.
@CharlieS I appreciate your time!
Yeah, for sure. What made it click for me was the Average inside the aggregation. Thank you both for the help!