I have a macro that, when used, takes user input via Interface tools and makes API calls. I've set this up as a batch macro so the user can provide multiple rows of parameters that each correspond to a single API call, getting the results all "unioned" together back at the end. However, I want to prevent poorly formed parameters from even turning into an API request to avoid spending quota.
The Error Message tool seems designed to help with this, but only as an analytic app. When the error is triggered there, it looks like the internal app workflow doesn't even run. If I use the Error Message tool in a batch macro inside a normal workflow, the error message just shows up as an error in the results log, but the macro's contents still run (and generate an API request, which immediately errors out).
Is there a way to have the workflow in the macro stop execution on error? The macro itself doesn't have a Runtime "Cancel running workflow on error" option.
The only (inelegant) solution I can think of is to embed another batch macro inside the first batch macro. The outer macro accepts the user input parameters, forms them into a single record, QAs the parameters, and, if there is an error, doesn't pass anything to the inner batch macro. Otherwise, the record containing the parameters passes through to the inner macro, which executes the actual API call. Any alternatives?
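To make the intent concrete, here is the gate I'm after, sketched in Python rather than Alteryx tools. The field names and both helper functions are placeholders standing in for the QA checks and the real connector, not anything from the actual workflow:

```python
# Minimal sketch of the gate: only parameter rows that pass QA turn into
# API requests, so no quota is wasted. All names here are placeholders.

def validate_params(row: dict) -> list[str]:
    """QA step: return a list of problems; an empty list means the row is OK."""
    problems = []
    if not row.get("record_id"):
        problems.append("missing record_id")
    if not str(row.get("page_size", "")).isdigit():
        problems.append("page_size is not a number")
    return problems

def call_api(row: dict) -> dict:
    """Stand-in for the real connector / API request; pretend it costs quota."""
    return {"params": row, "status": "ok"}

def run_batch(param_rows: list[dict]) -> list[dict]:
    """One 'iteration' per row, with bad rows rejected before any request."""
    results = []
    for row in param_rows:
        problems = validate_params(row)
        if problems:
            print(f"Rejected {row}: {', '.join(problems)}")  # surface the error
            continue                                         # never hit the API
        results.append(call_api(row))                        # "union" the results
    return results
```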
Hi @Scott_Snowman,
I've never faced this problem, but one idea would be to use a Filter tool to run the test: if the test fails, no data is output. Another option is the CReW macro pack and its "blocking test" macro, and a last one is the Message tool with the message type set to "Error - and stop passing records through this tool".
Hope it helped!
Try this technique
Put your validation code on output 1 of a Block Until Done tool and the rest of your workflow on output 2. With the Message tools set to "Error - and stop passing records", the macro halts if one of your checks fails and leg 2 is never executed. The calling workflow may continue if you don't have "Cancel running workflow on error" set, but the rest of the macro shouldn't execute.
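In rough code terms (a hypothetical Python sketch; Alteryx expresses this with tools rather than code), each macro iteration behaves like this: the validation leg runs first and errors out on bad input, so the API leg is never reached.

```python
def run_macro_iteration(row: dict) -> dict:
    """Leg 1 validates; if it errors, leg 2 (the API call) is never reached."""
    # Leg 1: validation (placeholder check; the real QA lives in your macro)
    if not row.get("record_id"):
        # Equivalent to a Message tool set to "Error - and stop passing records"
        raise ValueError(f"Bad parameters, stopping this iteration: {row}")
    # Leg 2: the actual request -- only reached if leg 1 passed
    return {"params": row, "status": "ok"}   # stand-in for the connector call
```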
Dan
@danilang and @Ladarthure , thanks so much for the feedback!
I'm sorry for not being more explicit; that's what I get for writing a post right before heading to bed.
Both of your approaches would normally work, but the scenario I'm working with is similar to the screenshot below. There are multiple Interface tools connecting directly to an existing connector that I don't want to reverse-engineer. (I'm using the Salesforce connector in this example as a mock-up, but it isn't the Salesforce connector in the actual workflow.) So there is no workflow "before" the API request to run error checks on. (That being said, the Block Until Done technique you both suggested is a really good one for some other use cases I have!)
Is there any way to prevent the underlying connector from firing if the values in the Interface tools are malformed? If need be, I can pass the results of the Interface tools to update a dummy record, QA it as you've both described, and then feed that updated record into a macro that both updates and runs the connector, but I'd rather not go "macro within macro" if I can avoid it.
Thanks!
Hi,
To my mind, you are on the right track with a macro within a macro, so that you have more control over the queries you want to pass. I don't see any other solution, apart from reverse-engineering the tool, which could be more complicated than a macro in a macro.