We are running into an issue with a chained app in the Gallery that passes data through the chain.
Because each app in the chain creates a separate temp file, we are using a network drive to save the files from app #1 so they can be picked up by app #2, app #3, etc.
The issue comes in with concurrent users. Depending on the selections made, User #2's project may complete before User #1's, causing User #1's data to be overwritten and producing incorrect results.
Is there a way to pass the information from app #1 to app #2 without writing to a temp file on a network drive, or to identify the temp folder used by app #1 in subsequent apps in the chain?
I have this same problem. I have taken a GUID-type approach, using the __CloudUserId script to identify the user and create output files from workflow 1 named with the user's server ID. However, I am having trouble figuring out how to get workflow 2 to dynamically select the user's input files for a drop-down or list box using an external source. Do you know of a way to update the below file path dynamically in the workflow?
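For what it's worth, the naming convention behind this approach can be sketched outside of Alteryx. This is only an illustration of the idea, not Alteryx tooling: both workflows agree on how the per-user path is built from the __CloudUserId value, so workflow 2 can reconstruct the same path workflow 1 wrote to. The share path and helper name here are hypothetical.

```python
import os

# Hypothetical shared network location (stand-in for your actual share).
SHARED_DIR = r"\\server\share\chained_apps"

def user_output_path(cloud_user_id: str, base_name: str = "output.yxdb") -> str:
    """Build the per-user file path that both workflows agree on.

    Workflow 1 writes to this path; workflow 2 rebuilds the identical
    path from the same __CloudUserId value and reads it back.
    """
    return os.path.join(SHARED_DIR, f"{cloud_user_id}_{base_name}")
```

In Alteryx itself the equivalent would be a Formula tool concatenating the share path, the user ID, and the file name, feeding a Dynamic Input or Action tool.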
I am having the same issue as @KMiller . I am able to create a unique file to send to the second app but I'm not sure the best way to get the second app to know which unique file to look for. Can you elaborate more on the GUID and subdirectory method you mentioned in your previous post?
I have had success with the following approach that works for the gallery:
1) Put all of your workflows in the same directory. So if Workflow 1 kicks off workflow 2 which kicks off workflow 3, save them all to the same folder.
2) Have Workflow 1 output relative files for use in workflow 2. For example, workflow 1 can output to file1.yxdb, which workflow 2 will then use. Since no folder is specified, it will save in the same directory as the workflow.
3) In the interface designer settings, type in the filename of your second step. Again, no folder means it's a relative path.
4) In your second workflow, have any input/interface tools that are getting data from your first workflow use a relative reference just like step 2. You just want the filename.
5) When you save workflow 1 to the gallery, be sure to check the box in the workflow options >>> manage workflow assets for your successive workflows. I don't think it should matter whether you check the box for your relative outputs.
This works because when Alteryx runs the workflow in the Gallery, it spins up each instance in a separate temp folder, so there is no chance of concurrent users tripping over each other. And all of your steps are run from the same temp folder, so they can all read/write to it.
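The mechanism underneath the steps above can be demonstrated in a few lines of plain Python (not Alteryx, just an illustration of relative-path resolution): a bare filename resolves against the process's working directory, so two steps run from the same folder share the same file, while runs launched from different temp folders stay isolated.

```python
import os
import tempfile

old_cwd = os.getcwd()
with tempfile.TemporaryDirectory() as run_folder:  # stand-in for the Gallery's per-run temp folder
    os.chdir(run_folder)

    # "Workflow 1": write with a bare filename, no folder specified.
    with open("file1.yxdb", "wb") as f:
        f.write(b"intermediate data")

    # "Workflow 2": the same bare name resolves to the same folder.
    with open("file1.yxdb", "rb") as f:
        data = f.read()

    os.chdir(old_cwd)

print(data)  # b'intermediate data'
```

A second concurrent run would get its own temp folder, so its `file1.yxdb` never collides with this one, which is exactly why the relative-path recipe avoids the overwrite problem.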
This would be much simpler than other approaches explained here.
I'm not certain all workflows in a chain would run on the same worker server, which would break the relative-path solution.
Also, I like having the passed data available on a shared drive so I can audit, test, and debug more easily.
Right now I'm getting __cloudUserid in every workflow in the chain to build into workspace directory names.
I wish we could get the __cloudUserid only in the first app of a chain and add a timestamp to it, giving a unique, meaningful subdirectory name that could be passed to the next step in the chain... so I posted the idea and am hoping for the best.
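The idea of combining the user ID with a timestamp can be sketched like so. This is a hypothetical illustration of the naming scheme only; `cloud_user_id` stands in for whatever __cloudUserid returns, and the format is an assumption, not an Alteryx feature.

```python
from datetime import datetime, timezone
from typing import Optional

def workspace_name(cloud_user_id: str, now: Optional[datetime] = None) -> str:
    """Build a unique, human-readable workspace subdirectory name
    from the user ID plus a run timestamp, e.g. "u42_20240102_030405".
    """
    ts = (now or datetime.now(timezone.utc)).strftime("%Y%m%d_%H%M%S")
    return f"{cloud_user_id}_{ts}"
```

The first app in the chain would create this directory once and pass the name to each subsequent app, so concurrent runs by the same user also get distinct workspaces.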