I have a workflow that reads files from an FTP server and stores them in a folder on the C: drive. This workflow runs multiple times a day, since the files on the FTP server are updated daily. So far so good.
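For context, the first workflow's job can be sketched as a standalone script. This is a minimal illustration only; the host, credentials, remote directory, and local folder below are hypothetical placeholders, not values from the original post.

```python
import ftplib
import os


def download_all(host: str, user: str, password: str,
                 remote_dir: str, local_dir: str) -> list:
    """Fetch every file in remote_dir into local_dir, overwriting stale copies."""
    os.makedirs(local_dir, exist_ok=True)
    saved = []
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            target = os.path.join(local_dir, name)
            # Overwrite the existing local copy so downstream
            # workflows always see the latest version.
            with open(target, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
            saved.append(target)
    return saved


# Example call (hypothetical server and target folder):
# download_all("ftp.example.com", "user", "secret", "/daily", r"C:\data\daily")
```

Scheduling this (or the equivalent workflow) several times a day keeps the local copies fresh, which is exactly why the second workflow needs to read the live files rather than a snapshot.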
In addition, I have another workflow that uses the files in that C: folder as input data. My question is: in this second workflow, should the files be specified as workflow packages, or should I specify them as network files from within the Analytics Hub after saving the workflow to the Hub?
Currently they are specified as workflow assets, but the workflow appears to read the files in the state they were in when the workflow was saved, not the latest versions.
Thank you for your answer. I have a follow-up question: in a separate workflow I use 10 macros, and each macro has five input files. When I save this workflow to the Analytics Hub, will I need to specify every single one of these 50 files as a network file after saving to the Hub? Or does it work differently when macros are involved?
It is somewhat different for macros, since the data flowing into a macro comes from the parent workflow. If you have configured the parent workflow to access network files, then those are what will flow through into the macro.
Any template files used by your macros (which define the structure of the data the macro expects) are fine to package up when you publish the macro, since that data does not need to be dynamic; but you could also read them from a network location if you prefer.