Hi
We have a problem with Alteryx workflow runtime
My team performs ad hoc data analysis,
where users ask for a specific analysis one time only, or at a very low frequency (say, once a year).
Usually we receive the raw data as txt or csv files.
Our current solution is Alteryx Designer, which is installed on the team's laptops.
The problem is that the data keeps growing, and 5 million records is not that rare.
Since the datasets are large, workflow runtimes are long. In some cases we have to let a workflow run overnight.
We convert the raw csv/txt data to the Alteryx Database (.yxdb) file format. It helps, but a workflow can still take several hours or need to run overnight.
We are in the process of onboarding Alteryx Server, but I don't think it will solve the runtime problem. We would still have to create the workflow, test it, run the entire flow, search for bugs, etc.,
all locally, and then migrate it to prod. All that for a one-time analysis, and we would still need to run the workflow locally several times while testing it.
Is this correct, or am I misunderstanding the Alteryx Server concept?
I thought about uploading the raw data to an Azure DB or another cloud database, and having the workflow read the data from the database instead of from csv/txt/yxdb files.
Would that decrease the run time?
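For context on what that upload step could look like outside of Alteryx: a minimal Python sketch of chunked CSV-to-database loading, assuming pandas and SQLAlchemy are available and that the target is any SQLAlchemy-supported database (the Azure SQL connection string shown in the comment is a placeholder, not a tested value). Table and function names here are illustrative, not part of any Alteryx API.

```python
import pandas as pd
from sqlalchemy import create_engine

def load_csv_to_db(csv_path, table_name, engine, chunksize=100_000):
    """Bulk-load a large CSV into a database table in chunks,
    so memory use stays flat even for millions of records."""
    total = 0
    for chunk in pd.read_csv(csv_path, chunksize=chunksize):
        # Append each chunk; the table is created on the first write.
        chunk.to_sql(table_name, engine, if_exists="append", index=False)
        total += len(chunk)
    return total

# Hypothetical Azure SQL connection (placeholder values):
# engine = create_engine(
#     "mssql+pyodbc://user:password@myserver.database.windows.net/mydb"
#     "?driver=ODBC+Driver+18+for+SQL+Server")
# load_csv_to_db("raw_data.csv", "raw_data", engine)
```

Note that moving the data to a database mainly helps if the heavy filtering and aggregation also happen in the database (e.g. via Alteryx's In-Database tools), so less data travels to the laptop; simply reading the same full table over the network can be slower than a local .yxdb file.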
I am looking for an Alteryx-based solution where the team can create a workflow, click Run, and see the results in minutes.
Any other ideas to solve the runtime problem?
Thank you, Tal