I did the same as most, but went for separate filters.
@andyuttley I believe it's the way that the expression is compiled in the Alteryx Engine.
If you have them all in one filter, each record is checked against each condition. So in this example all 300K records are checked against all 6 conditions in one filter.
If you have them in separate filters, records are removed from the process in a waterfall fashion.
It's most efficient to put your biggest remover (made up that phrase, but I mean the condition that the most records fail) as the first filter, and to continue in that order, so there are fewer records left for each subsequent condition to be checked against.
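The effect can be sketched outside Alteryx. This is a minimal Python illustration of the hypothesis above (one combined filter evaluates every condition for every record, while chained filters only evaluate survivors); the field names, conditions, and pass rates are all made up:

```python
import random

random.seed(0)

# Synthetic stand-in for the 300K input (smaller here for speed).
records = [
    {"qty": random.randint(0, 99),
     "region": random.choice("NSEW"),
     "active": random.random() < 0.5}
    for _ in range(50_000)
]

# Ordered with the "biggest remover" first, per the tip above.
conditions = [
    lambda r: r["qty"] >= 80,        # fails ~80% of records
    lambda r: r["active"],           # fails ~50% of survivors
    lambda r: r["region"] == "N",    # fails ~75% of survivors
]

# Hypothesis for one combined filter: every record is checked
# against every condition (no short-circuiting).
combined_evals = len(records) * len(conditions)
combined = [r for r in records if all(c(r) for c in conditions)]

# Waterfall: each filter only sees the survivors of the previous one,
# so the later conditions run against far fewer records.
waterfall_evals = 0
survivors = records
for cond in conditions:
    waterfall_evals += len(survivors)
    survivors = [r for r in survivors if cond(r)]

print(f"combined:  {combined_evals:,} condition evaluations")
print(f"waterfall: {waterfall_evals:,} condition evaluations")
```

Both routes keep exactly the same records; the waterfall just does far fewer condition evaluations, and the saving grows the more the first filter removes.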
It's not really going to make a difference on only 300k records.
I just did some testing, 22.5 seconds on average for one filter and 21 seconds for 6 filters (in the order in the post, not the most efficient).
Which is quite considerable actually, as it's the Auto Field tool that takes most of the time in my workflow (performance profiling says 18.5 seconds).
Although, what is weird is that if you add up the individual times of the performance profiling, you get 145ms for 6 filters vs 117ms for 1 filter.
My guess is that because the timings are so small, the overhead that performance profiling adds to each tool is what slows things down, since the workflow consistently runs quicker with 6 filters when profiling is turned off.
I first found this out years back on processes with hundreds of millions of records, so with 300k records we're talking really small margins.
Anyway, mine was probably similar to the others.
* "Patiently" awaits the next instalment