Hi,
I have a workflow that reads files from an FTP server and stores them in a folder on the C: drive. This workflow runs multiple times a day, since the files on the FTP server are updated daily. So far so good.
In addition, I have another workflow that uses these files from the C: folder as input data. My question is: in the second workflow, should these files be specified as Workflow Packages, or should I specify them as Network Files from within the Analytics Hub after saving the workflow to the Hub?
Currently they are specified as Workflow Assets, but the flow seems to read each file in the state it was in when the workflow was saved, not the latest version of the file.
The files will need to be saved to a network location that the Analytics Hub can access. The Analytics Hub will not be able to access files saved to your C: drive, as this is (generally) a local drive.
I would recommend:
1) Saving the files to a shared network folder that the Analytics Hub can access
2) Configuring the second workflow to read from this folder
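For illustration only, the two steps above can be sketched outside Alteryx as a plain Python mirror script. Everything here is a placeholder assumption, not something from this thread: the FTP host, the UNC share path, and the anonymous login are all hypothetical and would need to match your own environment.

```python
from ftplib import FTP
from pathlib import PureWindowsPath

# Hypothetical names -- substitute your own FTP server and network share.
FTP_HOST = "ftp.example.com"
SHARE_ROOT = PureWindowsPath(r"\\fileserver\shared\ftp-drop")


def dest_path(remote_name: str) -> PureWindowsPath:
    """Map a remote file name to its UNC destination on the share."""
    return SHARE_ROOT / remote_name


def mirror(host: str = FTP_HOST) -> list[str]:
    """Download every file in the FTP root directory to the network share.

    Returns the list of file names that were downloaded.
    """
    downloaded = []
    with FTP(host) as ftp:
        ftp.login()  # anonymous login; use ftp.login(user, passwd) otherwise
        for name in ftp.nlst():
            # Write each remote file to the shared folder the Hub can reach.
            with open(dest_path(name), "wb") as fh:
                ftp.retrbinary(f"RETR {name}", fh.write)
            downloaded.append(name)
    return downloaded
```

The key point the sketch makes is in `dest_path`: the destination is a UNC path on a shared server, not a local `C:\` path, so the second workflow (and the Analytics Hub) can read the same files the first workflow writes.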
Let me know how you get on, or if you have any questions.
Hi @jamielaird,
Thank you for your answer. I have a follow-up question. In a separate workflow I use 10 macros, and each macro has five input files. When I save this workflow to the Analytics Hub, will I need to specify every single one of these 50 files as a Network File after saving to the Hub? Or does it work differently when macros are involved?
Best,
Sandra
Hi @sandrajansson,
It is somewhat different for macros, since the data flowing into a macro comes from the parent workflow. If you have configured the parent workflow to access network files, these are what will flow through into the macro.
Any template files used by your macros (which define the structure of the data the macro expects) are fine to package up when you publish the macro, as that data does not need to be dynamic, but you could also read them from a network location if you prefer.