Hi
We have many workflows where we consolidate a group of files, and today we re-consolidate all of them every time.
This is a big waste of time, since old files rarely change.
So I want to find a good method where I consolidate all files in a folder once, and on the next run only add what's new, remove any changed file from my Alteryx yxdb, and add it again.
Any ideas?
Hi @Hamder83
Here is what you can do.
1. Get all file names using the Directory tool.
2. Extract the date from each file name and filter to only the latest files (you can keep a small temp file that stores how far the last run got).
3. Read only the filtered files using the Dynamic Input tool.
Helpful guide : https://community.alteryx.com/t5/Alteryx-Knowledge-Base/The-Ultimate-Input-Data-Flowchart/ta-p/20480
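The three steps above could be sketched in Python like this. The `sales_YYYY-MM-DD.csv` naming pattern and the function names are assumptions for illustration only, not part of the workflow:

```python
# Hypothetical sketch of steps 1-3: parse a date out of each file name and
# keep only the files dated after the last consolidation run.
import re
from datetime import date

# Assumed naming pattern: the file name contains a YYYY-MM-DD date.
DATE_RE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def file_date(name):
    # Step 2: extract the date embedded in the file name (None if absent).
    m = DATE_RE.search(name)
    return date(int(m.group(1)), int(m.group(2)), int(m.group(3))) if m else None

def files_to_load(names, last_loaded):
    # Step 2 (filter): keep only files dated after the last run.
    # Step 3 would then feed this list to the Dynamic Input tool.
    return [n for n in names
            if (d := file_date(n)) is not None and d > last_loaded]
```

For example, `files_to_load(["sales_2024-01-01.csv", "sales_2024-02-01.csv"], date(2024, 1, 1))` keeps only the February file.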
Hope this helps : )
Hi @atcodedog05
I feel the tricky part is:
I load all the files and save the LastWriteTime (in a column in the file?).
But that means I have to load each file just to be able to match it against the output of the Directory tool?
Here is what I have tried on my SQL server:
I read all file names and LastWriteTimes from the server and compare them to the directory. If a file exists but its LastWriteTime has changed, I generate a delete query, and then I re-run the remaining data.
But how, in practice, would you do this with files?
Without adding it as a column in the file?
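For reference, the delete-and-reload pattern described above might look roughly like this, with SQLite standing in for the database and a small side table holding each file's LastWriteTime so the data file itself needs no extra column. All table and function names here are made up for the sketch:

```python
import os
import sqlite3

def sync_file(conn, path):
    # Compare the file's current LastWriteTime against the one stored in a
    # side table ("files"). If it changed (or the file is new), delete that
    # file's rows from the consolidated table ("data") and reload them.
    conn.execute("CREATE TABLE IF NOT EXISTS files (name TEXT PRIMARY KEY, mtime REAL)")
    conn.execute("CREATE TABLE IF NOT EXISTS data (source TEXT, line TEXT)")
    name = os.path.basename(path)
    mtime = os.path.getmtime(path)
    row = conn.execute("SELECT mtime FROM files WHERE name = ?", (name,)).fetchone()
    if row is not None and row[0] == mtime:
        return "unchanged"
    # Changed or new: remove the old rows, then reload the whole file.
    conn.execute("DELETE FROM data WHERE source = ?", (name,))
    with open(path) as f:
        conn.executemany("INSERT INTO data VALUES (?, ?)",
                         [(name, line.rstrip("\n")) for line in f])
    conn.execute("INSERT OR REPLACE INTO files VALUES (?, ?)", (name, mtime))
    conn.commit()
    return "reloaded"
```

Calling `sync_file` twice on an unchanged file does nothing the second time; touching the file triggers a delete-and-reload of just that file's rows.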
Hi @Hamder83
Just keep a separate yxdb file that stores only the last write date. That way, each run starts by reading one tiny file that may contain only a single row (the last write date).
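A minimal sketch of that one-row watermark file, using JSON as a stand-in for the yxdb (the file name and layout are assumptions):

```python
import json
import os

WATERMARK = "last_write.json"  # stands in for the tiny one-row yxdb

def read_last_write(path=WATERMARK):
    # The watermark file holds a single value: the newest LastWriteTime
    # already consolidated. A missing file means "load everything".
    if not os.path.exists(path):
        return 0.0
    with open(path) as f:
        return json.load(f)["last_write"]

def write_last_write(mtime, path=WATERMARK):
    # Overwrite the single row after each successful consolidation run.
    with open(path, "w") as f:
        json.dump({"last_write": mtime}, f)
```

The key point is that the watermark lives beside the consolidated data, so no column ever has to be added inside the data files themselves.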
Hope this helps : )
Hi @atcodedog05
Maybe I'm missing something, but I don't see how to solve it correctly. If you have a chance, would you take a look at my sample flow and maybe correct it?
Thank you