
Hi Community members,
A solution to last week’s challenge can be found here.
This challenge was submitted by Yoshiro Fujimori (@yoshiro_fujimori). Thank you, Yoshiro, for this great challenge!
Many data analysts run multiple Alteryx workflows daily. Monitoring when each workflow runs, how long it takes, and whether it encounters any issues is crucial for troubleshooting and audit readiness. Sound familiar? Then this challenge is for you!
For this week’s challenge, you are provided with a series of Alteryx log files. Configure the Directory tool provided in the workflow start file to gather these files. Note that the Directory tool only lists the files; to read their contents, you need a Dynamic Input tool.
Your task is to analyze these logs, extract key operational metrics, and from these metrics, determine which workflow ran the longest.
Task 1: For each log file, extract:
- Workflow name
- Start time
- Duration (in seconds)
- Number of warnings
- Number of errors
- Number of conversion errors
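If you want to reason about the extraction step outside Alteryx first, the logic can be sketched in Python. This is a minimal sketch: the sample log text, its line formats, and the message prefixes (`Warning:`, `Error:`, `ConvError:`) are all hypothetical stand-ins, since real Alteryx engine logs differ in detail.

```python
import re

# Hypothetical log excerpt -- a stand-in for one of the challenge's
# log files, with a start line, message lines, and a finish line.
SAMPLE_LOG = """\
2024-03-01 08:15:02 Started running Sales_Refresh.yxmd
2024-03-01 08:15:03 Warning: Input Data (1): field truncated
2024-03-01 08:15:04 ConvError: Formula (3): DOUBLE lost precision
2024-03-01 08:15:05 Error: Output Data (7): file is locked
2024-03-01 08:16:32 Finished running Sales_Refresh.yxmd in 90.4 seconds
"""

def parse_log(text):
    """Extract the Task 1 metrics from one log file's text."""
    start = re.search(r"^(\S+ \S+) Started running (\S+)", text, re.M)
    finish = re.search(r"in ([\d.]+) seconds", text)
    return {
        "workflow": start.group(2),
        "start_time": start.group(1),
        "seconds": float(finish.group(1)),
        # \b keeps "Error:" from also matching inside "ConvError:"
        "warnings": len(re.findall(r"\bWarning:", text)),
        "errors": len(re.findall(r"\bError:", text)),
        "conv_errors": len(re.findall(r"\bConvError:", text)),
    }

print(parse_log(SAMPLE_LOG))
```

In Alteryx terms, this is roughly what a RegEx tool (parse mode) plus a Summarize tool accomplish per file.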
Task 2: Across all log files, compute summary statistics:
- Total number of jobs (log files)
- Total combined run time (in seconds)
- Total number of warnings
- Total number of errors
- Total number of conversion errors
Sort by the Seconds field in descending order so you can detect which workflow ran the longest.
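Once the per-file metrics from Task 1 are in hand, Task 2 is a sum over each metric plus a descending sort. A short sketch with made-up records (the workflow names and numbers are illustrative only):

```python
# Hypothetical per-log records, as extracted in Task 1.
records = [
    {"workflow": "Sales_Refresh.yxmd", "seconds": 90.4,  "warnings": 1, "errors": 1, "conv_errors": 0},
    {"workflow": "HR_Extract.yxmd",    "seconds": 12.0,  "warnings": 0, "errors": 0, "conv_errors": 2},
    {"workflow": "Audit_Prep.yxmd",    "seconds": 305.7, "warnings": 3, "errors": 0, "conv_errors": 1},
]

# Task 2: summary statistics across all logs.
summary = {
    "jobs": len(records),
    "total_seconds": sum(r["seconds"] for r in records),
    "total_warnings": sum(r["warnings"] for r in records),
    "total_errors": sum(r["errors"] for r in records),
    "total_conv_errors": sum(r["conv_errors"] for r in records),
}

# Descending sort on Seconds puts the longest-running workflow first.
longest_first = sorted(records, key=lambda r: r["seconds"], reverse=True)
print(summary)
print(longest_first[0]["workflow"])
```

This mirrors what the Summarize tool (Count and Sum actions) and the Sort tool do in the workflow itself.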
Hint 1: Logs may vary in structure slightly; pay close attention to how time and errors are logged.
Hint 2: When joining the log files to the extracted data, pay close attention to field names; you will need to create new fields to make the join work.
Need a refresher? Review the following lessons in Academy to gear up:
Sorting Data
Multi-Row Formula
Summarizing Data
Good luck!
The Academy Team