Hello,
I have two containers that have the same structure but different input sources. The first one gets its input data from Table A and sends it to a Python tool connected to the Run Command tool to pull data from SAP automatically. The second one does the same thing but gets its data from Table B. Since I can only have one instance of SAP open at a time, I have to run one container while the other is disabled, and vice versa. I can't use a Control Container either because there's no output from the first container to feed the second - hence the "no valid metadata for outgoing connection 1" error message.
I'm trying to figure out a better solution than disabling the containers one at a time. One idea is to use an Iterative Macro, but I'm not sure how to create one that would determine when the Run Command in the first container is done. Another idea is to put a pause between the two containers with a macro, but I'm not sure that's the best approach. My last idea is to incorporate some sort of Python script, but I'm not sure how to approach that either.
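For context, the Python route I'm imagining is roughly something like the sketch below, run from a single Python tool so the second pull can't start until the first command exits. The script paths here are just placeholders for whatever the Run Command tools currently call:

```python
import subprocess

# Placeholder commands - in reality these would be whatever the Run Command
# tool currently calls to trigger the SAP extracts for each table.
commands = [
    r'C:\scripts\pull_sap_table_a.bat',
    r'C:\scripts\pull_sap_table_b.bat',
]

for cmd in commands:
    # subprocess.run() blocks until the command exits, so the Table B pull
    # only starts after the Table A pull has finished and SAP is free again.
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"SAP pull failed for {cmd}: {result.stderr}")
    print(f"Completed: {cmd}")
```

I'm not sure whether that's better or worse than solving it with macros, which is why I'm asking.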
Could you please provide some suggestions on how I can approach this for best practice?
Many thanks,
kwl
You should be able to accomplish this with a batch macro - try this one out here (I think the updated working version is in the comments): Dynamically enabling / disabling workflow sections - Alteryx Community
Hi @alexnajm
Since my workflow relies on SAP to pull data, how do I set up the batch macro so it knows when the Run Command is complete?
Thanks,
kwl
It shouldn't kick off the second container until the first container completes!
@alexnajm , I'm new to using macros. Could you please explain how I should set up the macro so it knows when the first container has completed? I'm pulling data from SAP, so how do I tell the macro when that's done?
Thanks,
kwl
Have you tried the Block Until Done tool yet? Control Containers would also work if you're on 2023.1, but regular containers will not.
The Block Until Done tool doesn't work because the containers rely on separate input sources. The Control Container doesn't work either because the second container depends on the first container's output, and in my case that output is files being pulled from SAP rather than a data stream.
I'm wondering if, once the command is passed along to the Run Command tool, Alteryx considers that part of the flow complete. Those tables are pretty big in SAP and could take quite a while to finish. There are a couple of CReW macros you could try: Wait allows you to pause the workflow, and Conditional Runner may be able to help. But if the issue is that things outside the workflow are preventing it from running (i.e., logging into SAP), I would separate the two pieces and run them separately, or set up the second workflow to kick off after an event.
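If you do split it into two workflows, one rough option is a small wrapper script that runs them back to back from the command line. This is just a sketch and assumes the command-line engine (AlteryxEngineCmd.exe) is available with your license; the paths are placeholders:

```python
import subprocess

# Placeholder paths - adjust to your Alteryx install and workflow locations.
ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"
workflows = [
    r"C:\workflows\pull_table_a.yxmd",
    r"C:\workflows\pull_table_b.yxmd",
]

for wf in workflows:
    # Each call blocks until the workflow finishes, so only one SAP session
    # is ever open at a time.
    completed = subprocess.run([ENGINE, wf], capture_output=True, text=True)
    if completed.returncode != 0:
        raise RuntimeError(f"{wf} failed:\n{completed.stdout}\n{completed.stderr}")
```

A scheduler (or an event on the first workflow) could kick this off, which keeps the two SAP pulls strictly sequential without touching the containers at all.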
@alexnajm - I'll explore the batch macro and see if I can make it work. Thank you!
@jdminton - I just learned about the CReW macros this week, so I'll explore them as well. The Control Container works great if you have an output, but not so great otherwise because of the connection issue. I thought about using a pause, but that would require knowing how long the first container will take to run. I have a workaround so far that avoids the Run Command tool, but it's not ideal. Thank you!
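One more thought on the pause idea: instead of a fixed wait, a small Python step could poll for the exported SAP file before letting the second container's pull start, so I wouldn't need to guess the run time. The file path and timeout below are just example values:

```python
import os
import time

# Example values only - point this at wherever the first container's SAP
# extract lands, and pick a timeout that comfortably exceeds the worst case.
EXPECTED_FILE = r"C:\sap_exports\table_a_extract.csv"
TIMEOUT_SECONDS = 60 * 60   # give up after an hour
POLL_SECONDS = 30           # check every 30 seconds

start = time.time()
while not os.path.exists(EXPECTED_FILE):
    if time.time() - start > TIMEOUT_SECONDS:
        raise TimeoutError(f"{EXPECTED_FILE} never appeared - SAP pull may have failed")
    time.sleep(POLL_SECONDS)

print(f"Found {EXPECTED_FILE}; safe to start the second pull")
```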