This is more of a best practices issue than an intractable problem, but I thought I'd see how other people might approach this:
I pull in new sales data weekly to append to a dataset I prepare for reporting. Due to the idiosyncrasies of sales patterns and our data warehouse, some weeks this means I pull six days of data, and others seven. Normally I just manually toggle the filter tool in the attached screenshot between '<' and '<=' depending on the dates present in each edition of the sales data, but I'm sure there's a better way to configure this.
Would using a combination of temporary columns and filtering out duplicates after the join be an efficient way to approach this? Using a Unique tool seems like it may be prone to error, but maybe I'm overthinking it and should just take the time to fine-tune a workflow segment using Unique.
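For what it's worth, the dedupe-after-append idea can be sketched outside of Alteryx. This is a minimal pandas illustration, not your workflow: the frames and the `sale_id` key are made up, but it shows how appending an overlapping pull and then keeping one row per key (the Unique-tool equivalent) absorbs the six-vs-seven-day difference.

```python
import pandas as pd

# Hypothetical data -- substitute your real key columns.
existing = pd.DataFrame(
    {"sale_id": [1, 2, 3],
     "sale_date": ["2024-01-01", "2024-01-02", "2024-01-03"]}
)
new_week = pd.DataFrame(
    {"sale_id": [3, 4, 5],
     "sale_date": ["2024-01-03", "2024-01-04", "2024-01-05"]}
)

# Union the two pulls, then keep one row per sale_id.  The overlapping
# row (sale_id 3 here) collapses to a single copy, so it doesn't matter
# whether the new pull repeats a day already loaded.
combined = pd.concat([existing, new_week]).drop_duplicates(subset="sale_id")
```

The risk you mention is real: this only works if the key columns truly identify a row, so duplicates that legitimately exist in the sales data would need a finer-grained key.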
What are the conditions under which you would pull six days vs. seven? Workflow solutions are only viable if you can extract those conditions from something in the data.
Non-workflow solutions I might try, depending on those conditions, include setting up 6/7 as an input in a config file, or manipulating the max date upstream so that the filter can always treat it the same way, with that adjustment isolated in its own workflow section.
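The "manipulate the max date upstream" idea amounts to deriving the cutoff from the data itself rather than hard-coding the comparison. A minimal sketch, with a made-up frame and column name, assuming the pull should always be kept through the latest date actually present:

```python
import pandas as pd

# Hypothetical weekly pull -- only six days arrived this week.
pull = pd.DataFrame({"sale_date": pd.to_datetime(
    ["2024-01-01", "2024-01-02", "2024-01-03",
     "2024-01-04", "2024-01-05", "2024-01-06"])})

# Derive the cutoff from the data, so the filter's operator never has to
# be edited by hand: always keep rows up to the max date present.
cutoff = pull["sale_date"].max()
kept = pull[pull["sale_date"] <= cutoff]
```

In Alteryx terms that's a Summarize (max of the date) joined back in, feeding a single '<=' filter that never changes.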
It depends on whether the batch load for our sales database is current or not. Sometimes it will have the past seven days; other times it will be missing the most recent date.
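If that's the only condition, it is extractable from the data itself. A minimal sketch (the dates and names are placeholders): compare the max date in the pull against the date the batch should reach, and that comparison tells you whether you have six or seven days.

```python
import pandas as pd

# Hypothetical check: is this week's batch current?
pull_max = pd.Timestamp("2024-01-06")   # most recent date in the pull
week_end = pd.Timestamp("2024-01-07")   # date a current batch would reach

batch_is_current = pull_max == week_end
days_pulled = 7 if batch_is_current else 6
```

That flag could drive the filter (or the config value) automatically instead of a manual edit each week.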