I have a series of workflows that apply business logic and create output files based on database extracts. The extracts are spread across 3 different workflows and total almost 90 different queries. The workflows are pretty simple - each is basically just Input Data -> Select -> Output Data (as yxdb), repeated in 90 different groups (like the screenshot below). This probably isn't ideal, but I inherited it from a previous developer and it's generally run fine, so I haven't invested the time to clean it up.
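For anyone who hasn't seen this pattern, each group is roughly equivalent to the following (a minimal Python sketch, assuming pyodbc and pandas; the DSN, query, and column names are placeholders, not the real extracts):

```python
import pandas as pd
import pyodbc

# One "group": run a query, keep a few columns, write the result out.
conn = pyodbc.connect("DSN=warehouse")  # hypothetical DSN
df = pd.read_sql("SELECT * FROM sales.orders", conn)  # Input Data tool

# The Select tool step: keep only the columns downstream logic needs.
df = df[["order_id", "customer_id", "order_date", "amount"]]

# The Output Data step (Alteryx writes .yxdb; CSV stands in here).
df.to_csv("orders_extract.csv", index=False)
```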
Lately, however, there have been some performance issues, which I believe are coming from the DB; but with so many extracts it's hard to tell where the bottleneck is.
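If it helps to narrow things down outside of Alteryx, a quick timing harness run over the same queries will show which extracts are slow (a sketch, assuming you can collect the queries into a dict; names and queries below are made up):

```python
import time
import pyodbc

conn = pyodbc.connect("DSN=warehouse")  # hypothetical DSN

# Placeholder entries; in practice, pull these from the 3 workflows.
queries = {
    "orders": "SELECT * FROM sales.orders",
    "customers": "SELECT * FROM sales.customers",
}

# Time each extract so the slow tables stand out.
for name, sql in queries.items():
    start = time.perf_counter()
    cursor = conn.cursor()
    cursor.execute(sql)
    rows = cursor.fetchall()
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(rows)} rows in {elapsed:.1f}s")
```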
Is there a better way to go about this? A batch macro, maybe? I'm not sure whether that would help performance, though.
A batch macro will just make the workflow cleaner and more dynamic; it will not affect performance. And since the workflow is really simple, I don't think there is much to improve here. If there is a custom query on the Input Data tool, though, you could optimize that query.
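To be concrete about "cleaner and more dynamic": a batch macro replaces the 90 copy-pasted groups with one group driven by a control table, conceptually like this loop (a Python sketch; the queries and file names are invented):

```python
import pandas as pd
import pyodbc

conn = pyodbc.connect("DSN=warehouse")  # hypothetical DSN

# The control table a batch macro iterates over: one row per extract.
extracts = [
    {"query": "SELECT order_id, amount FROM sales.orders", "out": "orders.csv"},
    {"query": "SELECT customer_id, name FROM sales.customers", "out": "customers.csv"},
]

# One generic "group" run once per control row -- the same work the 90
# groups do today, so the database sees the same load either way.
for ex in extracts:
    pd.read_sql(ex["query"], conn).to_csv(ex["out"], index=False)
```

Note the queries above select only the columns they need rather than SELECT * followed by a Select tool; pushing the column selection into the query itself is the one change here that can actually reduce database and transfer time.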
Thanks, that's more or less what I thought. I did enable performance profiling, so that should highlight whether a particular table is the issue.