Hello community,
I'm having a hard time sizing Alteryx memory usage for a specific workflow that is going to prepare 2B rows of data every day.
I have seen everything in the community about sort/join memory utilization, but I would like to know whether there is a way to estimate the peak memory usage of a workflow.
For testing, I created the simple workflow attached and watched the Alteryx engine's memory usage in Task Manager, but I wasn't able to draw any conclusions. I know memory usage may depend on the complexity of the workflow, but I would like to understand whether there is a simple rule of thumb for it.
Are there any guidelines?
Thanks everyone.
Fernando Vizcaino
I'm not aware of a good way to estimate peak memory usage other than testing. The Alteryx engine allocates memory dynamically and will even overflow to disk as virtual memory when necessary. The system/user settings only specify the starting amount; the engine manages it from there.
It sounds like you've been doing your homework on memory usage. I would just suggest keeping an eye on any "blocking tools" from Tara's Periodic Table of Alteryx Tools:
https://community.alteryx.com/t5/Engine-Works-Blog/The-Periodic-Table-of-Alteryx-tools/ba-p/64120
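If it helps as a back-of-the-envelope check (this is my own rough heuristic, not an official Alteryx formula), a worst-case upper bound for a blocking tool like Sort is that it may need to hold the entire incoming stream in memory at once, so row count times average record width gives you a ceiling to plan around:

```python
def estimate_blocking_tool_memory_gb(rows: int, avg_record_bytes: int) -> float:
    """Rough worst-case memory estimate for a blocking tool (e.g. Sort).

    Assumes the tool must buffer every incoming record before releasing
    anything downstream; real usage is often lower because the engine
    spills to disk once it exceeds its configured sort/join memory.
    """
    return rows * avg_record_bytes / (1024 ** 3)

# Hypothetical example: 2 billion rows at ~100 bytes per record
print(round(estimate_blocking_tool_memory_gb(2_000_000_000, 100), 1))  # ~186.3 GB
```

A number like that well above physical RAM tells you the engine will spill to disk, so fast temp-file storage matters more than trying to fit everything in memory.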