Sizing workflow memory usage
Hello community,
I'm having a hard time sizing the Alteryx memory usage for a specific workflow, which will prepare 2B rows of data every day.
I've seen the community posts about sort/join memory utilization, but I'd like to know whether there is a way to estimate the peak memory usage of a workflow.
For testing, I created the simple workflow attached and watched the Alteryx engine's memory usage in Task Manager, but I wasn't able to draw any conclusions. I understand that memory usage likely depends on the complexity of the workflow, but I'd like to know whether there is a simple rule of thumb for it.
Are there any guidelines?
Thanks everyone.
Fernando Vizcaino
Solved! Go to Solution.
I'm not aware of a good way to estimate peak memory usage other than testing. The Alteryx engine allocates memory dynamically and will even overflow to disk space as virtual memory when necessary. The system/user settings only specify the starting quantity, and the engine manages it from there.
It sounds like you've done your homework on memory usage. I would just suggest keeping an eye on any "blocking tools" from Tara's Periodic Table of Alteryx Tools:
https://community.alteryx.com/t5/Engine-Works-Blog/The-Periodic-Table-of-Alteryx-tools/ba-p/64120
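Since the engine's actual peak can only be found by testing, a back-of-envelope calculation of the raw data volume at least bounds what a blocking tool might need to hold. This sketch assumes a hypothetical average record width of 64 bytes (not a figure from the post) and just multiplies it out:

```python
# Rough lower bound on data held by a blocking tool (e.g. Sort),
# ignoring engine overhead. The 64-byte record width is an assumption
# for illustration; measure your real average record size instead.

ROWS_PER_DAY = 2_000_000_000   # 2B rows, from the question
BYTES_PER_ROW = 64             # assumed average record width

def estimate_gib(rows: int, bytes_per_row: int) -> float:
    """Raw data volume in GiB."""
    return rows * bytes_per_row / 2**30

if __name__ == "__main__":
    gib = estimate_gib(ROWS_PER_DAY, BYTES_PER_ROW)
    # Anything beyond the engine's configured sort/join memory limit
    # spills to disk rather than raising peak RAM further.
    print(f"~{gib:.0f} GiB of raw data per day")
```

This only bounds the data volume; the true peak depends on how many blocking tools run concurrently and on the engine's spill-to-disk behavior.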