Once my system converts generic data to my format, it logs the converted/altered files and then compares that log with subsequent generic imports, so that only new entries are run through the (extensive) conversion process. The basic structure works well when new data is present, but once all of the generic data has been converted, a user has to remember to turn off the containers to avoid field errors from zero records going through the format conversion process.
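For anyone following along, the incremental-conversion idea described above can be sketched in plain Python. This is only an illustration of the logic, not anything Alteryx-specific; the key field `"id"` and the `convert_record` callback are assumptions for the example.

```python
# Keep a log of records that have already been converted, and on each run
# push only the unseen entries through the (expensive) conversion step.

def incremental_convert(generic_rows, converted_log, convert_record):
    """Convert only rows whose key is not already in the log."""
    seen = {row["id"] for row in converted_log}
    new_rows = [row for row in generic_rows if row["id"] not in seen]
    converted = [convert_record(row) for row in new_rows]
    converted_log.extend(converted)  # remember them for the next run
    return converted

log = [{"id": 1, "value": "A"}]          # already converted on a prior run
incoming = [{"id": 1, "value": "A"},     # duplicate: skipped
            {"id": 2, "value": "b"}]     # new: converted
new = incremental_convert(incoming, log,
                          lambda r: {"id": r["id"], "value": r["value"].upper()})
# only id 2 goes through the conversion; the log now holds both ids
```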
Does anyone have an example of a workflow where a detour is being triggered by a count of zero records? It is my first encounter with the detour tool, and thus far I am failing miserably!
The Detour tool is normally controlled by an Interface tool that specifies whether it should detour to the right.
One option would be to add a Count Records tool and append its output onto your data. After that, add a Filter tool with the condition set to record count > 0. When the condition is true, process your records downstream; when it is false, do something else (e.g. send an error message).
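As a rough sketch of that Count Records + Filter pattern in Python: compute the record count, append it to every row, then branch on whether the count is greater than zero. The field name `record_count` and both callbacks are illustrative, not Alteryx internals.

```python
# Gate an expensive processing step on the record count, mimicking the
# Count Records -> Append -> Filter pattern described above.

def gate_on_count(rows, process, on_empty):
    count = len(rows)
    # append the count to every row, like Append Fields would
    tagged = [dict(row, record_count=count) for row in rows]
    if count > 0:
        return process(tagged)  # True branch: run the conversion
    on_empty()                  # False branch: e.g. raise a message
    return []

result = gate_on_count(
    [{"x": 1}],
    process=lambda rows: [r["x"] * 10 for r in rows],
    on_empty=lambda: print("No new records - skipping conversion"),
)
```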
Chris
Not sure the Detour tool will give you what you are looking for, but an alternative would be to use a Count Records tool to count the records, append that count to your main data stream, and use a Filter tool to check whether the count is greater than zero. If it is, pass the data through to the format conversion section.
I think this will answer your question, although I would like to see what other users think. Hope that helps.
EDIT: I was slow; essentially, try what @cgoodman3 suggests above.
Thanks for the quick feedback! This is pretty much the approach I have been taking up until now. The problem is that when no new records are present, the filter approach still passes 0 records / 0 bytes downstream through my conversion modules instead of bypassing them altogether. This causes a run delay and sets off a string of missing-field errors in the transpose/crosstab sections. What I really want is to completely bypass or disable the entire container so that the workflow does not process that section at all...
It appears that dynamically disabling tools is only possible within a macro or app at this point. I decided to join and union in text files that maintain the missing field names when they disappear because of the transpose/crosstab process. Not as clean as an automated bypass or container disable, but it keeps the errors and warnings in check and has a minimal impact on runtime.
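One way to read that workaround in plain Python: union the crosstab output with a zero-row template carrying the full set of expected field names, so downstream steps always see every column even when zero records flow through. `EXPECTED_FIELDS` and the dict-of-lists table shape are assumptions for this sketch, not the actual workflow.

```python
# Enforce a fixed schema on a table whose columns can vanish when a
# transpose/crosstab step receives no (or sparse) records.

EXPECTED_FIELDS = ["region", "q1", "q2", "q3"]

def enforce_schema(table):
    """Return a table with every expected field, filling absent ones with None."""
    n = len(next(iter(table.values()), []))  # current row count (0 if empty)
    return {f: table.get(f, [None] * n) for f in EXPECTED_FIELDS}

crosstab_out = {"region": ["east"], "q1": [100]}  # q2/q3 vanished upstream
fixed = enforce_schema(crosstab_out)
# fixed now contains all four fields; q2 and q3 are filled with None
```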