So I have a file that someone on another team delivers. It always has the same name and is always delivered to the same directory location. It is typically updated every workday, but the time of the update is always changing.
I want to pull it in whenever it is updated and aggregate it into a historical file of all the data delivered each day. I can build a workflow to pull it in, and I'm able to schedule that workflow to run. Ideally I want the output to be a .YXDB file.
The problem I'm having is that if I schedule it to pull hourly, it pulls in tons of duplicates on the hours when the file hasn't changed. If instead I scheduled the workflow to run daily, it could miss a file that wasn't updated until after the time of the scheduled run.
Is there a good way to append files based on the LastWriteTime attribute of the file? I have thought about joining against all the fields in the file, but I'd imagine that join would get very slow as the data got larger.
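For anyone wanting the underlying logic outside of Alteryx, here is a minimal Python sketch of the LastWriteTime approach: record the source file's modification time after each successful append, and on later runs append only when the current modification time is newer. All names here (`append_if_updated`, the state file) are hypothetical illustrations, not part of any Alteryx tool.

```python
import os


def append_if_updated(src_path, history_path, state_path):
    """Append src_path's contents to history_path only when the source's
    LastWriteTime is newer than the one recorded from the previous run."""
    mtime = os.path.getmtime(src_path)

    # Read the modification time recorded after the last successful append.
    last = None
    if os.path.exists(state_path):
        with open(state_path) as f:
            last = float(f.read().strip())

    if last is not None and mtime <= last:
        return False  # file unchanged since last run; skip to avoid duplicates

    # Append the new delivery to the running history file.
    with open(src_path) as src, open(history_path, "a") as hist:
        hist.write(src.read())

    # Record this delivery's modification time for the next run.
    with open(state_path, "w") as f:
        f.write(str(mtime))
    return True
```

Because the check costs one file-time comparison rather than a join across every field, it stays fast no matter how large the historical file grows.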
Yes, that is exactly what I'm trying to do. I'm trying to use the Directory and Dynamic Input tools together to bring in only new records. My trouble is that the records don't retain any of the file's metadata besides FullPath after they've passed through the Dynamic Input tool.
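Since FullPath does survive the Dynamic Input tool, the metadata can be re-attached by joining the records back to the Directory tool's output on FullPath, then filtering to files written after the last run. A minimal Python sketch of that join-and-filter (the sample rows and the `last_run` watermark are hypothetical stand-ins, not Alteryx output):

```python
from datetime import datetime

# Stand-ins for the two tool outputs: the Directory tool's file listing
# (which includes LastWriteTime) and the Dynamic Input tool's records
# (which only retain FullPath).
directory_rows = [
    {"FullPath": r"\\share\daily.csv",
     "LastWriteTime": datetime(2020, 1, 6, 9, 30)},
]
record_rows = [
    {"FullPath": r"\\share\daily.csv", "value": 42},
]

# Re-attach the metadata by joining on FullPath, then keep only records
# from files written after the last successful run (the watermark).
last_run = datetime(2020, 1, 6, 0, 0)
meta = {row["FullPath"]: row["LastWriteTime"] for row in directory_rows}
new_records = [
    {**rec, "LastWriteTime": meta[rec["FullPath"]]}
    for rec in record_rows
    if meta[rec["FullPath"]] > last_run
]
```

In workflow terms this is a Join tool on FullPath followed by a Filter on LastWriteTime, which avoids joining on every data field.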