
Alteryx Designer Discussions

Find answers, ask questions, and share expertise about Alteryx Designer.
SOLVED

Stop workflow processing when Iterative Macro hits error condition

SvenD
6 - Meteoroid

I provide a list of URLs in a workflow to an Iterative Macro that will then query an API.

Some of the URLs will need multiple iterations; most will finish without any (iteration = 0).

 

 

SvenD_0-1648102585094.png

 

It all worked well until I started hitting a request limit, and I am now trying to stop processing any URL once I am within a certain percentage of the limit. I have no problem stopping the macro from iterating, but I cannot stop it from processing each URL at least once.

 

My first thought was that I might be able to loop the 'E'(rror) output back to a filter like so, but I assume this backward loop does not make sense.

 

SvenD_1-1648102953277.png 

SvenD_2-1648102991955.png

 

Since the Iterative Macro 'resets' with every new row, I do not know how I can stop the workflow from running.

I am sending an Error Message from the macro and have set up the workflow to cancel on error.

 

SvenD_3-1648103262942.png

 

The macro still works through all the rows of its input.

 

SvenD_4-1648103341182.png

 

Any suggestions would be very welcome.

8 REPLIES
clmc9601
12 - Quasar

Hi @SvenD

 

Do you know the rate limit in advance? If so, you could have your iterative macro exit the process once it reaches a certain limit. You probably have the macro set up right now to exit once it finishes all the URLs, right? An iterative macro will stop processing once there are no records left in the loop. So instead of having it process all records, keep a cumulative iteration count in the loop. Append the column to all the URLs at the beginning or end of the macro, then have a filter tool remove all remaining URLs once the limit is reached.
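A rough sketch of that idea in plain Python (not Alteryx code): keep a cumulative request count alongside the URLs, and once the limit is reached, behave like the filter tool and drop every remaining URL. The limit value and URL names are made up for illustration.

```python
REQUEST_LIMIT = 5   # illustrative cap, not a real API value

def iterate(urls, cumulative=0):
    """One pass of the loop: process rows while under the limit, drop the rest."""
    processed = []
    for url in urls:
        if cumulative >= REQUEST_LIMIT:   # filter step: remove remaining URLs
            break
        cumulative += 1                   # cumulative iteration/request count
        processed.append(url)
    return processed, cumulative

done, count = iterate([f"https://example.com/{i}" for i in range(8)])
# only the first REQUEST_LIMIT URLs are processed; the rest are filtered out
```

The key point is that the count travels with the records through the loop, so the filter can act on it in the next pass.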

 

SvenD
6 - Meteoroid

I accepted your suggestion as a solution too early. I should have had my tea first ;)

 

The problem I have is that I cannot pass any information to the next row - the problem is not happening between iterations, it is happening between rows.

 

This is what I imagine is happening in the Alteryx engine at the moment:

 

SvenD_0-1648178482433.png

 

A list of URLs is passed on to the macro. A new instance of the macro (#1) is created with one URL as its parameter. It runs through its iterations and then outputs the data. Then the instance (#1) of the macro is deleted.

Then the next URL from the list starts a new instance (#2) of the macro. It runs through its iterations and then outputs the data (appended to the output of instance #1). Then the instance (#2) of the macro is deleted.

 

I cannot pass any information between the instances. My solution would have been either a global variable (which is not available) or some kind of feedback/extra column (like you would normally do between iterations).
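As a toy model of that behavior (plain Python, only mirroring the semantics described above): each row spawns a fresh instance, so any local state such as a 'stop' flag never survives to the next row.

```python
def macro_instance(url):
    stop = False   # fresh local state for every instance/row
    # ...iterations for this one URL could set stop = True here...
    return stop

# every row gets a brand-new instance, so no flag carries across rows
flags = [macro_instance(u) for u in ["u1", "u2", "u3"]]
```

This is exactly why a global variable or a cross-row feedback column would be needed, and neither exists between instances.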

 

I am thinking about wrapping the iterative macro in another iterative macro, which should give me a way of 'interrupting' the processing after each row as far as I understand.

 

Again any suggestions are welcome, including redesigning the process.

SvenD
6 - Meteoroid

Hello @clmc9601,

 

Thank you for taking the time to reply. It has been very helpful to think about my solution. I have not found a way to fix it, but I will continue to work on it.

 

Please find my answers below.

 

>Do you know the rate limit in advance?

I do not know the limit in advance. The remaining rate limit is sent in the reply header and is shared across the company.
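For illustration, checking a shared rate limit from reply headers might look like the sketch below. The header names (X-RateLimit-Remaining / X-RateLimit-Limit) are a common convention, not necessarily what this particular API uses.

```python
def remaining_fraction(headers):
    """Fraction of the shared rate limit still available."""
    remaining = int(headers.get("X-RateLimit-Remaining", 0))
    limit = int(headers.get("X-RateLimit-Limit", 1))
    return remaining / limit

# example reply headers; stop requesting once the fraction drops too low
headers = {"X-RateLimit-Remaining": "120", "X-RateLimit-Limit": "1000"}
frac = remaining_fraction(headers)
```

The fraction is what a "stop once I am within a certain percentage" filter would compare against a threshold.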

 

>If so, you could have your iterative macro exit the process once it reaches a certain limit.

The problem I have is that I cannot skip any future rows that the iterative macro will process. I can stop it from iterating for each URL, so it only runs once per URL, but I cannot stop it from running again for the next URL once an API error has been received.

Each time I output [Engine.IterationNumber] it always comes back as '0' (except in the rare cases where I have to make two API requests for the same URL for paged results). To me that means the iterative macro 'resets' for each row, making it impossible to feed any data back (such as a 'do not process' flag).

 

 

>You probably have the macro set up right now to exit once it finishes all the URLs, right?

It is set up to finish once all data has been downloaded from the API for one URL - it only ever sees one URL and not the whole list.

 

Looking at my message log, it appears - as described above - that the iterative macro is not iterating through the list, but rather creating new instances.
The log file shows Iteration '0' ([Engine.IterationNumber]) for each URL.

 

>So instead of having it process all records, keep a cumulative iteration count in the loop.

I don't seem to be able to keep anything cumulative, because I cannot pass data from one instance to the next. I can only pass data from one iteration to the next (which is needed for getting paged results).

 

>Append the column to all the URLs at the beginning or end of the macro, then have a filter tool remove all remaining URLs once the limit is reached.

 

Somehow I cannot get the API result to a filter in front of the macro.

 

Is it correct to say that iterative macros work through the URLs one row at a time when the input is given a list of URLs?

That means the macro only sees the data in that one row and runs within its closed scope. I can feed data back via the 'iterative output', but only within that specific row. Is it correct that it cannot access the list of URLs yet to be processed, and will only get the next row once processing for the current row has completed?

 

 

clmc9601
12 - Quasar

Hi @SvenD,

 

If you put all your URLs through a download tool inside the iterative macro, then yes it will process them all at once. That being said, you can force it to process one at a time and evaluate against the limit after each one. This is more computationally expensive, but it can help prevent overage. I created a skeletal iterative macro that shows my idea for looping one at a time-- do you think this would help your use case?
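The one-at-a-time pattern can be sketched in plain Python like this (fetch() is a stand-in for the Download tool, and the quota value is invented): take the first URL, process it, check the limit, and loop the remainder only if we are still under the limit.

```python
def fetch(url):
    # stand-in for the Download tool; pretend plenty of quota remains
    return 100

def run(urls, threshold=0):
    results = []
    while urls:
        first, urls = urls[0], urls[1:]   # 'First' / 'Skip First' split
        remaining = fetch(first)
        results.append(first)
        if remaining <= threshold:        # evaluate the limit after each row
            break
    return results
```

With a generous quota every URL is processed; once fetch() reports a value at or below the threshold, the loop stops before touching the next row.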

 

To answer your other questions at the bottom of the previous post:

If you were using a batch macro, that would be exactly the execution process. Batch macros take one incoming line at a time, at least for the Control input. The download tool itself acts similarly: it processes each row individually. However, iterative macros behave differently. Iterative macros will evaluate all rows sequentially and together, then evaluate against a condition, then loop the rest of the records again. I wonder if your iterative macro isn't functioning like an iterative macro, especially if all your iteration numbers are 0. Sounds like your condition is causing the macro to act just like a standard macro, or something to that effect.
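The contrast between the two execution models described above can be sketched in plain Python (a toy model, not Alteryx internals): a batch macro runs the body once per control row and never loops, while an iterative macro runs the body on all rows together and then loops back whatever the body returns.

```python
def batch_macro(rows, body):
    out = []
    for row in rows:                  # one instance per control row
        done, _loop = body([row])     # loop output is ignored: no looping
        out.extend(done)
    return out

def iterative_macro(rows, body, max_iterations=10):
    out = []
    for _ in range(max_iterations):   # all rows together, then repeat
        done, rows = body(rows)       # body returns (output, rows to loop)
        out.extend(done)
        if not rows:
            break
    return out

def body(rows):
    # toy body: output short rows, loop the rest back with one char cut off
    done = [r for r in rows if len(r) <= 1]
    loop = [r[1:] for r in rows if len(r) > 1]
    return done, loop
```

Running both on the same input shows the difference: the iterative version eventually emits the looped-back rows, while the batch version discards them.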

 

Is your API in Download or in Python (or something else)? If my attached skeleton doesn't help, could you please attach a sample of your macro?

SvenD
6 - Meteoroid

Hello @clmc9601 

 

Thank you so much for the reply.

 

I will have to check the iterative macro - it might be the case that I once added a control input and it now behaves like a batch macro.

 

It's time to knock off here - I hope you have a wonderful weekend, and I'll check your workflow on Monday morning.

clmc9601
12 - Quasar

Hi @SvenD,

 

Yes, adding a control parameter automatically changes the workflow from an iterative macro to a batch macro. Only batch macros can contain control parameters.

 

Sounds good! Have a great weekend.

SvenD
6 - Meteoroid

Hello @clmc9601,

 

I use the Download tool. The batch-like nature of that tool is what caused all the confusion. All my debug messages also triggered after each URL was processed by the Download tool (as I had set them up to send messages "Before Rows Where..."), leading me to think that something wasn't right with the iterative macro.

 

My macro does indeed do 1 iteration (or 2 runs) for the very few URLs that need to request a 2nd page - and all of them are processed in that first iteration, as these were the only ones passing the filter to the iteration output.

 

Now that I have a better understanding of the iterative macro, I have looked at your great example.

 

I have used your idea of the 'First' and 'Skip First' rows to only download one URL per iteration.

I also use the feedback from the Download tool (the API returns remaining vs limit). There are probably better ways to do this, but for my example I ended up adding a column that tracks limit issues. I then summarise over that column and update all rows with the maximum.

So in the next iteration I can filter all URLs out once a limit issue has been encountered.
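The final design can be sketched in plain Python like this (a toy model; process() and fetch() are invented stand-ins for the macro body and the Download tool): flag limit trouble after each download, take the maximum of the flag across all rows (the Summarize step), and stop touching the remaining URLs once it is set.

```python
def process(urls, fetch):
    out, limit_hit = [], 0
    while urls and not limit_hit:
        first, urls = urls[0], urls[1:]    # one URL per iteration
        remaining, data = fetch(first)
        flag = 1 if remaining <= 0 else 0  # column tracking limit issues
        limit_hit = max(limit_hit, flag)   # Summarize: max over the column
        out.append(data)
    return out

# demo fetch: the shared quota runs out on the third request,
# so the fourth URL is never downloaded
quota = iter([2, 1, 0, 5])
result = process(["a", "b", "c", "d"], lambda u: (next(quota), u.upper()))
```

Once any row sets the flag, the max propagates it to every remaining row, which is exactly what lets the next iteration filter them all out.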

 

I can cut that down to just the 'append field' tool, but I leave that as an exercise for the reader ;)

 

Regards,

 

Sven

 

“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration” - Edsger W. Dijkstra, 18 June 1975

clmc9601
12 - Quasar

I'm glad to hear you got it working! Thanks for the description of batch behavior of the Download tool-- I'm sure that will help future question-askers. I agree, the batch behavior of the Download tool was difficult for me to understand at first, too. Happy solving!
