I'm having a hard time sizing Alteryx memory usage for a specific workflow that will prepare about 2B rows of data every day.
I have seen everything in the Community about sort/join memory utilization, but I would like to know whether there is a way to estimate the peak memory usage of a workflow.
For testing, I created the simple workflow attached in order to observe the Alteryx engine's memory usage in Task Manager, but I wasn't able to conclude anything. I know memory usage may depend on the complexity of the workflow, but I would like to understand whether there is a simple rule of thumb for it.
I'm not aware of a good way to estimate peak memory usage other than testing. The Alteryx engine allocates memory dynamically and will even overflow to disk (using it as virtual memory) as necessary. The system/user settings only specify the starting amount; the engine manages it from there.
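That said, a rough back-of-envelope calculation can give you an upper bound for the data itself: multiply the row count by an average row width. The sketch below illustrates this in Python; the 50-bytes-per-row figure is a hypothetical placeholder, not something from Alteryx — profile a sample of your actual data to get a realistic width.

```python
# Back-of-envelope estimate of a dataset's in-memory footprint.
# NOTE: avg_row_bytes is an assumption you must measure yourself;
# actual engine usage also depends on tool overhead (sorts, joins, etc.).

def estimate_footprint_gb(rows: int, avg_row_bytes: int) -> float:
    """Approximate in-memory size of the dataset in GiB."""
    return rows * avg_row_bytes / 1024**3

# 2 billion rows at an assumed 50 bytes per row:
print(f"{estimate_footprint_gb(2_000_000_000, 50):.1f} GiB")  # ~93.1 GiB
```

If that bound already exceeds your RAM, you know blocking tools will spill to disk regardless of settings, and you can plan sort/join memory limits accordingly.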
It sounds like you've been doing your homework on memory usage. I would just suggest keeping an eye on any "blocking tools" from Tara's Periodic Table of Alteryx Tools: