I run reconciliations on a daily basis to make sure we are putting in quality work. If the report finds an error, it creates an xlsx file via the Basic Table and Render tools. As the team researches the recon, they'll put notes into the xlsx file. My goal is to bring those notes over to the next day's recon, which I can do with a Dynamic Input tool and join to the results of that day's recon. The problem I'm having is when the file doesn't exist because there were no errors the day before. How can I have the workflow check whether the file exists and run one branch of logic if it doesn't?
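In pseudo-terms, what I'm after is something like this (a sketch only; the path here is a placeholder, not my real file):

```python
import os

# Placeholder path to yesterday's recon notes (not the real path).
NOTES_PATH = "recon_notes.xlsx"

def notes_file_if_present(path):
    """Return the path if yesterday's notes file exists, else None
    so the workflow can continue with today's recon alone."""
    return path if os.path.exists(path) else None

if notes_file_if_present(NOTES_PATH) is None:
    # No errors yesterday, so no notes file: skip the join step.
    pass
```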
I feel like the solution is in the CReW Macros. The issue I'm having is that my workflow begins with the Salesforce Input tool, which doesn't have an incoming connection, so how do I go from the Runner tool to kick off the Salesforce Input tool? And how do I keep the Salesforce Input tool from running?
I'm running Alteryx Server 2018.3 on Windows Server 2016 Standard.
I'm still a little unclear on what exactly you are asking for overall, but I think you should look into the Parallel Block Until Done tool. Basically, the tools connected to the second (2) output will not run until all of the tools connected to the first (1) output are complete:
This way you can stop the Salesforce tool from running from (2) until everything is done from (1). One thing to note is that Designer will still try to follow its own tool order of operations, so I make sure that the tool IDs of the tools going into (2) of the Parallel Block Until Done tool come after the tool IDs of the tools going into (1), to keep the (2) stream from trying to run first.
@JakeS, thank you for the response. I had looked into the Block Until Done tool but decided I couldn't use it. The problem is what to do if, say, the first input doesn't exist... there may or may not be two files. If only one file exists, I want the workflow to continue without the second file. However, if both files exist, I want it to run with both.
An additional nuance is that one of my inputs is a file and the other is the Salesforce connector. In theory, the Salesforce connector will always succeed; it's the file that may or may not exist.
I understand the issue now. I think the best way to handle the second file is a batch macro that checks whether a file exists at that path and, if it does not, sends no input. To do this, create a dummy input that feeds a Formula tool which checks your file path (e.g., with the FileExists() function). Depending on the result, it passes a value into the macro telling it whether or not to run the container that holds the Input Data tool. Whether or not the Input Data tool returns records, the join still needs a value to run successfully, which is why there is a Text Input going to the output in the batch macro. I have attached an example, which I hope helps you follow my description above. Hopefully this helps with the problem you are trying to solve!
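The macro's logic can be sketched outside Alteryx as well; here is a minimal Python illustration of the same pattern, assuming a placeholder path and a dummy record standing in for the Text Input:

```python
import os

def read_notes_or_dummy(path):
    """Mimic the batch macro: return the file's rows when it exists,
    otherwise a single placeholder record so the downstream join
    still receives input (the role of the Text Input in the macro)."""
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            return [line.rstrip("\n") for line in f]
    # No file: emit one dummy record so the join can still run.
    return ["<no notes file>"]
```

Either way, the key design point is the same: the downstream join always receives at least one record, so a missing file never stops the workflow.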