I work with a small team of developers, and we are looking to grow. With that, we want to scale our processing capability appropriately. Currently we run Alteryx Designer to process our data, which is then pushed out to Tableau Server, where we build our data visualizations. I also make use of the command-line tool for a bit of automation. Most of our workflows run in under a couple of minutes on our laptops, a few take upwards of five minutes, and one particularly large, complex workflow takes around six hours to complete.
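For context, my automation is nothing fancy: just a script that feeds workflows to the engine's command-line runner one at a time. A rough sketch of what it looks like (the engine path and workflow folder here are placeholders, adjust for your own install):

```shell
#!/bin/sh
# Placeholder paths -- point these at your actual Alteryx install and workflow folder.
ENGINE="/c/Program Files/Alteryx/bin/AlteryxEngineCmd.exe"
WORKFLOW_DIR="./workflows"

# Run each workflow in sequence; log failures but keep going.
for wf in "$WORKFLOW_DIR"/*.yxmd; do
    [ -e "$wf" ] || continue      # nothing matched the glob; skip
    echo "Running $wf ..."
    "$ENGINE" "$wf" || echo "FAILED: $wf"
done
echo "batch done"
```

Everything runs serially on one laptop, which is part of why the six-hour workflow hurts so much.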
In preparation for expanding our team, we are looking at getting a server (or perhaps just a beastly desktop) to run our workflows before pushing the output to Tableau Server. We anticipate developing more of those complex, six-hour workflows, so we want to set up a machine that can really crunch through the data quickly. My questions are:
- Would it be best to get Alteryx Server to run our workflows, or to just continue running them via the command line with Alteryx Designer? I know Alteryx Server is expensive, and I'm not sure what benefit it would give us over what we're already doing.
- Does a server crunch data any better than a desktop? I imagine a 12-core server is no different from a 12-core desktop, so at that point the only real difference would be the OS.
- Should we run a physical server/desktop or a VM? I've heard that VMs don't run as fast as physical machines, something to do with vCPUs vs. physical CPUs (supposedly a vCPU can be half the speed of a real core, since it may map to a hyperthread rather than a full physical core).