Hi,
As mentioned in the title, I am preparing a file that will incorporate new data each week from a source. It is basically a data dump from a survey platform, which will be updated with new info (stacked on top of the old data) each week (or at some other interval).
What I am trying to achieve is to monitor what has changed week-over-week between the new dataset and the old dataset (the previous week's saved file).
I basically want to move the previous week's main dataset into an archive folder before this week's main dataset is saved to another folder.
I have tried the Run Command tool, but for some reason it is just not working out.
I want to use the previous week's file in a new workflow to compare against this week's new main dataset and see what has changed.
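In case it helps to see the comparison logic spelled out: assuming the weekly dump simply stacks new survey rows onto last week's rows, "what has changed" boils down to the rows in this week's file that were not in last week's. A minimal Python sketch of that idea (the function name and file paths here are placeholders, not anything from the actual workflow):

```python
import csv

def new_rows(previous_path, current_path):
    """Return rows present in the current dump but not in the previous one.

    Assumes each weekly dump stacks new survey rows onto the old ones,
    so the week-over-week change is the set of rows added since last week.
    """
    with open(previous_path, newline="") as f:
        previous = set(tuple(row) for row in csv.reader(f))
    with open(current_path, newline="") as f:
        current = [tuple(row) for row in csv.reader(f)]
    return [row for row in current if row not in previous]
```

If rows can also be edited or deleted (not just appended), you would instead join the two files on a unique respondent ID and compare field by field.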
TIA
Make sure to give a thumbs up if this solves your question.
Hi Warcry,
Thanks for the workflow.
So I am running two workflows, the first one is the main workflow which will clean and manipulate the data with the new data each week.
The second should be the comparison workflow.
Will the workflow you provided archive the latest main dataset first, before pulling in the recent update to the main dataset for comparison?
Your inquiry: I basically want to move the previous week's main dataset into an archive folder before this week's main dataset is saved in another folder.
This sample shows how to configure the Run Command tool with a cmd command to move your file wherever you need it to go. Of course, you'll need to make the necessary changes for your own paths.
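For reference, the archive step that the cmd move command performs (move last week's file out of the way before the new one is written) can be sketched in self-contained Python. The function name and paths below are placeholders, not the actual folders from the sample workflow:

```python
import os
import shutil
import time

def archive_previous(main_path, archive_dir):
    """Move last week's main dataset into the archive folder.

    The archived copy gets a date stamp so successive weeks don't
    overwrite each other. Point the arguments at your own folders
    on the shared server.
    """
    if not os.path.exists(main_path):
        return None  # first run: nothing to archive yet
    os.makedirs(archive_dir, exist_ok=True)
    name, ext = os.path.splitext(os.path.basename(main_path))
    stamp = time.strftime("%Y%m%d")
    dest = os.path.join(archive_dir, f"{name}_{stamp}{ext}")
    shutil.move(main_path, dest)
    return dest
```

Running this before the main workflow writes its output reproduces the intended order: archive first, then save the new dataset.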
Hi Warcry,
Thanks for the response.
So I have two workflows in one: the first runs the main dataset, which gets saved in a folder on our shared server, and the second is basically the Run Command workflow, which should move this newly created file from the main workflow to another folder.