
Alteryx Designer Discussions

Find answers, ask questions, and share expertise about Alteryx Designer.

Iterative Macro Help


Alteryx beginner here... I have about 1.6 million records in my dataset, and for each one I need to generate a random number, add a couple of calculated fields, and summarize the data based on a few different fields.  Then I need to run all 1.6 million records again and again, generating a new random number each time, adding the calculated fields, and summarizing.  I will need to run 10,000 simulations.  I am currently using an iterative macro to perform this task, and it is running very slowly.  I ran through all 10,000 simulations last night and it looked like there was 3.3 TB of data in the macro step... I'm wondering if there's a more efficient way of doing this.


Below is the macro workflow.  It's very simple: I read in the data, calculate my random number and other fields, then summarize that particular run and spit it out.  The same data that was read in is then fed back into the macro.  My maximum number of iterations is set to 10,000 to achieve my 10,000 simulations.  Any ideas as to what might be causing Alteryx to hold so much data in memory?  I'm having some trouble attaching my workflows, so I hope this summary is helpful enough.
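For readers who think better in code, here is a minimal Python sketch of what one pass of the simulation described above might look like. The field names (`group`, `amount`) and the calculated field are illustrative stand-ins, since the actual fields weren't shared:

```python
import random
from collections import defaultdict

def run_one_simulation(records, seed=None):
    """One simulation pass: draw a random number per record,
    derive a calculated field from it, then summarize by group.
    Field names here are hypothetical placeholders."""
    rng = random.Random(seed)
    totals = defaultdict(float)
    for rec in records:
        draw = rng.random()              # random number per record
        calc = rec["amount"] * draw      # hypothetical calculated field
        totals[rec["group"]] += calc     # summarize by group
    return dict(totals)

data = [{"group": "A", "amount": 100.0},
        {"group": "A", "amount": 50.0},
        {"group": "B", "amount": 200.0}]
summary = run_one_simulation(data, seed=42)
```

The original question is essentially: how do you repeat this pass 10,000 times over 1.6 million records without the intermediate data piling up?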


Alteryx Certified Partner

Hi @DanielLipsey 


I don't know what your data looks like, how complex it is, or what you're doing with your Formula tool, so it's hard to tell. Maybe you don't even need a macro for this task, or a Batch Macro would do the trick.

Unless your output has a use in the next iteration, I don't see an iterative macro as a great solution in your case.


Every time you run your macro, the calculations inside each tool of the macro are stored in the temp folder. When you run the same data over and over with new calculations, it stores both the current and the previous calculations. And when there isn't enough RAM, it spills over to your hard drive. What are your calculations in the Summarize tool?
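The memory point above suggests a general pattern, sketched below in plain Python: keep only each iteration's small summary, never a copy of the full record set per iteration. The record values and summary calculation are illustrative assumptions:

```python
import random

def simulate(records, n_sims):
    """Run n_sims simulation passes over the records, retaining
    only one summary value per pass (not the per-record data)."""
    rng = random.Random(0)
    summaries = []
    for _ in range(n_sims):
        # per-record random draw, summarized immediately
        total = sum(amount * rng.random() for amount in records)
        summaries.append(total)  # store only the small summary
    return summaries

out = simulate([100.0, 50.0, 200.0], n_sims=5)
```

Memory here grows with the number of simulations, not with simulations times records, which is the difference between a few kilobytes of summaries and the terabytes of intermediate data the original macro accumulated.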

It's going to be pretty heavy going at that scale, but I would suggest using a Generate Rows tool to generate 10,000 rows.


You can then use Append Fields to replicate the data set against those rows and apply a Summarize tool. I think with a little work it would do this on a fairly memory-efficient basis.
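The Generate Rows + Append Fields idea above can be sketched in plain Python: pair every record with every simulation ID (a cartesian product, which is what Append Fields produces), then summarize per simulation ID. `itertools.product` streams the pairs rather than materializing all `n_sims * n_records` rows at once. The record values are hypothetical stand-ins:

```python
import itertools
import random
from collections import defaultdict

records = [100.0, 50.0, 200.0]   # stand-in for the real dataset
n_sims = 4                       # Generate Rows: 1..n_sims
rng = random.Random(1)

totals = defaultdict(float)
# Cartesian product of simulation IDs and records, streamed
for sim_id, amount in itertools.product(range(1, n_sims + 1), records):
    totals[sim_id] += amount * rng.random()  # draw, calculate, summarize

# totals now holds one summary row per simulation
```

At the scale in the question (1.6M records x 10,000 simulations = 16 billion row-pairs) the join is still heavy, which is presumably why the original poster ended up running it in batches.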


I answered a similar problem for a Monte Carlo simulation:


If you could specify more details of the random number generation and the summarization, I'd be happy to put together a fuller sample.



Sorry for the late reply... I was able to adapt this workflow to my problem and it worked pretty well.  I still had to run it in four separate batches of 2,500, but it ran in a reasonable amount of time compared to my macro approach.
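The batch split described above (10,000 simulations in four batches of 2,500) can be sketched with a small helper; the function name and signature are hypothetical, not part of any Alteryx API:

```python
def batches(total, size):
    """Yield inclusive (start, end) simulation-ID ranges of at
    most `size` simulations, covering 1..total."""
    start = 1
    while start <= total:
        end = min(start + size - 1, total)
        yield (start, end)
        start = end + 1

ranges = list(batches(10_000, 2_500))
# four ranges: (1, 2500), (2501, 5000), (5001, 7500), (7501, 10000)
```

Each range can then drive one run of the workflow, keeping any single run's intermediate data to a quarter of the full job.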


Thank you for the help, especially since all I was able to provide was a screenshot of my workflow and not the actual file.  




You're right; based on @jdunkerley79 's reply, I don't think a macro is the best approach.  I'm still trying to wrap my head around macros and their best uses.  It sounds like, based on what you're saying, that since the macro output that feeds back into my iteration input doesn't change, there's no reason to use a macro.  Thanks for the reply!




Yeah, I could tell I was having memory problems based on how much data was showing in my workflow at the end.  Thanks for the reply... this is helpful in understanding how Alteryx uses memory.