Hi All,
We have a scheduled workflow that runs at 3.30am every morning to extract data out of a data warehouse. To minimise the risk of the extraction failing if/when the warehouse errors, I've built a records check test into the start of the workflow and ticked the 'Cancel running workflow on error' option in the Runtime settings.
Lately we've had some tables not fully refreshing, and it looks like it's due to the increasingly sporadic finish time of the warehouse refresh. It usually finishes any time between 2.40am and 3.20am, but last night it didn't finish until 4am!
I feel like I have a couple of options:
Add a records check to each input (time-consuming, and could result in disjointed data spanning two days)
Move the scheduled start time of the extraction workflow later, though it would likely need changing again in the future
Find which table in the warehouse refreshes last, apply the records check to that one, and make it the first extraction to run
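For what it's worth, the third option above can be expressed as a simple freshness gate: check whether the last-refreshing table has rows stamped with today's date before letting the extract proceed. A minimal sketch in Python, using an in-memory SQLite database to stand in for the warehouse; the table and column names (`last_refreshed_table`, `loaded_at`) are illustrative assumptions, not from the original post:

```python
import sqlite3
from datetime import date

def table_refreshed_today(conn, table, ts_column):
    # Gate for option 3: has the slowest-refreshing table loaded
    # rows stamped with today's date yet? Names are illustrative.
    latest = conn.execute(f"SELECT MAX({ts_column}) FROM {table}").fetchone()[0]
    return latest is not None and latest[:10] == date.today().isoformat()

# Demo against an in-memory SQLite database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE last_refreshed_table (loaded_at TEXT)")
# Empty table: the refresh hasn't landed yet, so the gate fails.
stale = table_refreshed_today(conn, "last_refreshed_table", "loaded_at")
conn.execute(
    "INSERT INTO last_refreshed_table VALUES (?)",
    (date.today().isoformat() + " 03:15:00",),
)
# A row stamped today: the gate passes and the extract can run.
fresh = table_refreshed_today(conn, "last_refreshed_table", "loaded_at")
print(stale, fresh)  # False True
```

In an Alteryx workflow the same test would sit in front of the extracts as a records check feeding the 'Cancel running workflow on error' behaviour you already have enabled.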
Just wanted to check if there was anything else I'm missing/alternative suggestions people may have!
Thanks in advance!
Hi @DavidSkaife, I've seen this tackled a number of ways. The first is more of a brute-force option: have multiple schedules each run a check, with some logic in the workflow that decides whether to run the refresh/update based on whether new data is available. The other way is to have an event-monitoring tool such as Control-M trigger when an event has occurred (such as the tables being populated) and then use the Gallery APIs to trigger the workflow.
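The decision logic behind that first (multiple-schedules) option can be sketched in a few lines: each scheduled check compares the warehouse's latest load timestamp against the last successful extract, and only fires when there's something new. This is a minimal illustration, not the Gallery API call itself; the ISO timestamp strings are made-up examples:

```python
def should_trigger(last_extracted_at, warehouse_loaded_at):
    """Decide whether a scheduled check should kick off the extract.

    Run only when the warehouse has finished a load we haven't
    extracted yet; otherwise skip and let the next schedule retry.
    Timestamps are ISO-8601 strings, so plain string comparison
    orders them chronologically.
    """
    if warehouse_loaded_at is None:
        return False  # warehouse refresh hasn't finished yet
    return last_extracted_at is None or warehouse_loaded_at > last_extracted_at

# The 3:30 check fires because the 3:12 load is newer than
# yesterday's extract; a later check skips once we've caught up.
print(should_trigger("2024-05-01T03:35:00", "2024-05-02T03:12:00"))  # True
print(should_trigger("2024-05-02T03:35:00", "2024-05-02T03:12:00"))  # False
```

In the event-driven variant, the same gate would sit in Control-M (or similar) and, when it returns true, queue the workflow through the Gallery API rather than waiting for the next schedule.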