Hi,
Thanks for your time. I have a scenario where I need to check whether an ID exists in the prior month, in prior consecutive months, or in any months that are not consecutive.
My dataset looks something like the one below. Each Inventory Number gets a row whenever there is a failure, so an inventory number can have multiple records: if it failed every month there will be 12 records, and if it never failed there will be no records for it at all. What I need to derive are two fields, MonthlyFailStatus and MonthlyFailStatusDesc, so the records can be grouped into these buckets:
Failed only once - this inventory number failed (think of it as a job failing for a server) only once in the entire 12 months, so there is just one record.
Failed few times in the same month - this inventory number failed several times within a single month, e.g. Inv123 failed on the 5th, 10th, and 25th.
Failed in consecutive 3 months - this inventory number failed in three consecutive months (the calendar runs Jan to Dec), e.g. it failed in Jan, Feb, and March.
Repeating Fail - this inventory number failed a few times but the months are NOT consecutive, e.g. Inv123 failed in Jan, then Apr, then Jul.
Failed in consecutive 2 months - this inventory number failed in two consecutive months, e.g. Jan and Feb, or Oct and Nov (failed in 2 straight months).
Failed in more than 3 Consecutive Months - this inventory number failed in more than three consecutive months, e.g. Jan, Feb, March, and April.
Any suggestions on the best approach? I have a few different data sources in my Alteryx workflow, where I use Join and Union. Thanks a lot.
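To make the bucket definitions above concrete, here is a minimal Python sketch of the classification logic, assuming each inventory's failures arrive as a list of (year, month) tuples. The function name `classify` and the tuple representation are illustrative choices, not part of the Alteryx solution:

```python
def classify(months):
    """Bucket a list of (year, month) failure tuples, per the definitions above."""
    if len(months) == 1:
        return "Failed only once"
    uniq = sorted(set(months))
    if len(uniq) == 1:
        return "Failed few times in the same month"
    # Put months on a continuous scale so Dec 2020 -> Jan 2021 counts as consecutive
    idx = [y * 12 + m for (y, m) in uniq]
    consecutive = all(b - a == 1 for a, b in zip(idx, idx[1:]))
    if consecutive:
        if len(uniq) == 2:
            return "Failed in consecutive 2 months"
        if len(uniq) == 3:
            return "Failed in consecutive 3 months"
        return "Failed in more than 3 Consecutive Months"
    return "Repeating Fail"
```

For example, `classify([(2020, 1), (2020, 4), (2020, 7)])` falls into the "Repeating Fail" bucket because the month gaps are larger than one.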
Hi @Raj_007 ,
I've created a workflow that I think answers your question.
Since we are mostly interested in the month and year of each failure, I used a Formula tool with a DateTimeTrim function to force each start date to the first of its month.
That allowed me to use a Summarize tool to group on the inventory number, group on the new month field, and count the number of failures for each inventory in each month.
Then, with a Multi-Row Formula tool, I could determine whether, for a particular inventory, those failures occurred in consecutive months: if yes, flag the row with a "C"; if not, flag it with an "N".
Note how the first failure for each Inventory No does not get a flag assigned. That will be important later on. Then, using another Summarize tool, I grouped on the Inventory No and concatenated the field containing the flags.
Hence, from the concatenated flag string, I now know each inventory's failure pattern.
The final step was to implement this mapping logic inside a Formula tool. It's not very elegant, but it gets the job done.
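The workflow steps above (DateTimeTrim to the month, group and count, then flag each subsequent month "C" or "N") can be sketched in Python. This is an illustration of the approach under my own naming, not the Alteryx tools themselves; `monthly_flags` and the `(inventory, date)` row shape are assumptions:

```python
from collections import defaultdict
from datetime import date

def monthly_flags(rows):
    """rows: iterable of (inventory_no, failure_date).
    Returns {inventory_no: flag string}, where each month after the first
    contributes 'C' if it immediately follows the previous failure month,
    else 'N' (mirroring the Multi-Row Formula / Summarize concatenation)."""
    # Step 1: trim each failure date to the first of its month
    months = defaultdict(set)
    for inv, d in rows:
        months[inv].add(date(d.year, d.month, 1))
    # Steps 2-3: sort each inventory's months and flag consecutiveness
    flags = {}
    for inv, ms in months.items():
        idx = sorted(m.year * 12 + m.month for m in ms)
        flags[inv] = "".join("C" if b - a == 1 else "N"
                             for a, b in zip(idx, idx[1:]))
    return flags
```

An inventory that failed in Jan, Feb, and March ends up with the flag string "CC", while one that failed in Jan and then Apr gets "N"; the final Formula tool then maps each pattern to its MonthlyFailStatusDesc bucket.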
Let me know if that was what you were after and please feel free to ask any questions. Thanks for the challenge.
Regards,
Angelos
Hi Angelos,
Thank you so much for your time. Let me first try to understand it, so please give me some time to look at it and see how it works. I'll get back to you with any questions, and then we can mark this thread as closed.
Hi Angelos,
The sample dataset was just for testing. In the final output (say we write it to an Excel file), we would like to keep the dataset as is,
duplicates included. The reason is that we also have other columns where the inventory number is duplicated because of differing values in some field, for example:
Inv12345 has 3 consecutive fails, and another column such as Status has values like Cancelled, In Progress, etc.
What that means is I still want to keep all the records, duplicates included, but flag or compute what you provided as the solution. What are my options?
I do not want just the unique inventory numbers; I want everything, duplicates and all,
like below. Thanks for your time.
InventoryNo | StartDate | MonthlyFailStatus | MonthlyFailStatusDesc |
---|---|---|---|
1000101 | 1/10/2020 | Single | Failed only once |
1000102 | 1/10/2020 | Multiple | Failed few times in the same month |
1000102 | 1/15/2020 | Multiple | Failed few times in the same month |
1000102 | 1/21/2020 | Multiple | Failed few times in the same month |
1000103 | 1/15/2020 | Multiple | Failed in consecutive 3 months |
1000103 | 2/10/2020 | Multiple | Failed in consecutive 3 months |
1000103 | 3/10/2020 | Multiple | Failed in consecutive 3 months |
1000104 | 1/25/2020 | Multiple | Repeating Fail |
1000104 | 5/10/2020 | Multiple | Repeating Fail |
1000104 | 7/10/2020 | Multiple | Repeating Fail |
1000104 | 10/23/2020 | Multiple | Repeating Fail |
1000105 | 8/15/2020 | Multiple | Failed in consecutive 2 months |
1000105 | 9/12/2020 | Multiple | Failed in consecutive 2 months |
1000106 | 10/15/2020 | Single | Failed only once |
1000107 | 4/15/2020 | Multiple | Failed few times in the same month |
1000107 | 4/21/2020 | Multiple | Failed few times in the same month |
1000107 | 4/29/2020 | Multiple | Failed few times in the same month |
1000108 | 4/15/2020 | Multiple | Repeating Fail |
1000108 | 6/15/2020 | Multiple | Repeating Fail |
1000108 | 8/15/2020 | Multiple | Repeating Fail |
1000109 | 6/15/2020 | Multiple | Failed in consecutive 3 months |
1000109 | 7/21/2020 | Multiple | Failed in consecutive 3 months |
1000109 | 8/30/2020 | Multiple | Failed in consecutive 3 months |
1000110 | 7/15/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 8/13/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 9/12/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 10/10/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 11/10/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 12/12/2020 | Multiple | Failed in more than 3 Consecutive Months |
1000110 | 1/13/2021 | Multiple | Failed in more than 3 Consecutive Months |
Hi Angelos, thank you so much. It really worked. I had never used the Find Replace tool before; I thought we might have to do some joins,
but it was really simple with the Find Replace tool. Thanks a lot!
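For readers following along outside Alteryx: the Find Replace step here amounts to computing the status once per unique inventory number and then mapping it back onto every original row, duplicates included. A minimal sketch, where the `statuses` dictionary and the row tuples are hypothetical stand-ins for the real data:

```python
# Per-inventory results, as produced by the classification step (illustrative values)
statuses = {
    "1000101": "Failed only once",
    "1000103": "Failed in consecutive 3 months",
}

# Original rows, with duplicates per inventory number preserved
rows = [
    ("1000101", "1/10/2020"),
    ("1000103", "1/15/2020"),
    ("1000103", "2/10/2020"),
]

# The Find Replace / Join equivalent: attach the status to every row
enriched = [(inv, start, statuses[inv]) for inv, start in rows]
```

Because the lookup is keyed on InventoryNo, every duplicate row keeps its own StartDate and simply gains the shared MonthlyFailStatusDesc, which matches the desired output table above.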