Alteryx Designer Desktop Discussions

SOLVED

System Requirements to Run on Local Machine (Quickly/Large Datasets)

LindonB
8 - Asteroid

I use Alteryx at work. The local machine isn't spec'ed very highly, but I'm using an HVD, and there I can process very large data sets of multiple gigs quickly.

 

I'm also attempting to do some data cleaning on my personal computer for educational research. It runs painfully slowly in some places of my workflow, despite the data sets being relatively small, maybe 20 MB. Interestingly:

 

1. The slowness seems to occur in some places of a workflow and not others, despite the data sets being roughly the same size.

2. The entire platform locks up more frequently, and the software needs to be closed.

 

My local machine has the following specs, and I'm also willing to upgrade key components or swap over to a Ryzen 3 build. Can anyone advise on where the bottleneck is likely occurring? Nothing seems to be maxing out, and I'm assuming that Alteryx's computations are more CPU-intensive than GPU-intensive.

 

CPU: i5-9600K (6 Core; 6 Threads)

GPU: RTX 2060 SUPER

RAM: 16 GB V-Color Skywalker DDR4 SDRAM 3000 (15-16-16-35) - reaching about 9 GB load

SSD: Crucial NVMe SSD with about 70 GB free, where Alteryx is installed. Data and outputs are on a second, identical SSD with 900 GB free.

 

I did come across this article, which seems to suggest that the largest improvement would come from a CPU upgrade.

https://community.alteryx.com/t5/Engine-Works/Hardware-Matters/ba-p/424033#
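
In case it helps anyone reading later: one way to see which resource is actually the limit is to sample CPU, RAM, and disk activity while the workflow runs. Below is a minimal sketch in Python (assuming Python 3 and the third-party psutil package are installed); it isn't Alteryx-specific, it just logs system-wide load once a second so you can see what peaks during the slow parts of the run.

```python
# Rough resource sampler to run alongside an Alteryx workflow.
# Requires Python 3 and the third-party 'psutil' package (pip install psutil).
import time
import psutil

def sample(interval_s: float = 1.0, duration_s: float = 120.0) -> None:
    """Print CPU %, RAM usage, and cumulative disk I/O once per interval."""
    start = time.time()
    io_start = psutil.disk_io_counters()
    while time.time() - start < duration_s:
        cpu = psutil.cpu_percent(interval=interval_s)   # blocks for interval_s
        mem = psutil.virtual_memory()
        io = psutil.disk_io_counters()
        read_mb = (io.read_bytes - io_start.read_bytes) / 2**20
        write_mb = (io.write_bytes - io_start.write_bytes) / 2**20
        print(f"CPU {cpu:5.1f}% | RAM {mem.used / 2**30:4.1f} GiB ({mem.percent:.0f}%) "
              f"| disk read {read_mb:8.1f} MiB, write {write_mb:8.1f} MiB")

if __name__ == "__main__":
    sample()
```

If CPU sits near 100% while RAM and disk stay quiet, the article's advice about a CPU upgrade is probably right; if RAM is pinned, the memory-limit suggestions in the replies below matter more.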

 

4 REPLIES
AkimasaKajitani
17 - Castor

How much data is flowing through your workflow? (You can see it inside the red box in the screenshot below.)

If it is very large, you might want to change the memory limit setting.

 

[Screenshot: AkimasaKajitani_0-1606883218158.png]

You can change the Memory Limit in the User Settings, the Workflow Configuration, or the System Settings.

 

[Screenshot: AkimasaKajitani_1-1606883356415.png]

 

[Screenshot: AkimasaKajitani_2-1606883433442.png]

 

I think the computer specs are good enough.

If the data exceeds the available memory, I recommend either rebuilding the workflow with a batch macro so that less data is processed at the same time, or adding more memory.
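
The batch-macro suggestion amounts to processing the data in smaller slices so that less of it has to sit in memory at once. As a rough analogy only (not Alteryx itself), here is a Python/pandas sketch of the same idea; the file name, column name, and chunk size are placeholders, not anything from this thread.

```python
# Chunked processing: the same idea as a batch macro, shown with pandas.
# Only one chunk is held in memory at a time; file/column names are placeholders.
import pandas as pd

totals: dict[str, int] = {}
for chunk in pd.read_csv("large_input.csv", chunksize=100_000):
    counts = chunk.groupby("category").size()          # per-chunk partial result
    for key, value in counts.items():
        totals[key] = totals.get(key, 0) + int(value)  # combine partial results

print(totals)
```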

 

LindonB
8 - Asteroid

Thank you for the reply. Increasing the max memory consumption did help. Going to reply to the thread generally with some more notes.

LindonB
8 - Asteroid

Some General Findings:

 

Increasing the "Default Dedicated Sort/Join Memory Usage" did help. I doubled my system RAM to 32 GB and raised the maximum usage from about 4 GB to about 16 GB. Benchmarking in CAM shows that, when running the workflow, I now routinely consume around 8 GB, with occasional spikes to 16 GB. It runs MUCH faster now. Obviously these figures are specific to my workflow and data, but it's an interesting finding.
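
For anyone who wants to reproduce this kind of measurement without CAM, a small script that samples the memory of the Alteryx engine processes works too. A minimal sketch, again assuming Python 3 with psutil; the "AlteryxEngine" name filter is my assumption about how the engine process is named and may need adjusting on your machine.

```python
# Track resident memory of Alteryx engine processes while a workflow runs.
# Assumes Python 3 + psutil; the 'AlteryxEngine' name filter is an assumption.
import time
import psutil

peak_gib = 0.0
for _ in range(300):                      # sample once a second for ~5 minutes
    total_bytes = 0
    for proc in psutil.process_iter(["name", "memory_info"], ad_value=None):
        name = proc.info["name"] or ""
        mem = proc.info["memory_info"]
        if "AlteryxEngine" in name and mem is not None:
            total_bytes += mem.rss
    gib = total_bytes / 2**30
    peak_gib = max(peak_gib, gib)
    print(f"engine memory now {gib:4.1f} GiB (peak {peak_gib:4.1f} GiB)")
    time.sleep(1)
```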

AkimasaKajitani
17 - Castor

I'm glad it's resolved.

 

I think it was slowed down by some tools that consume a lot of memory.
