Hi All,
I have input data that produces an output of over 200K rows. I developed an Alteryx workflow which first takes a lot of time (almost 1 hour) and then produces an output file of about 140 MB.
The input is actually an ODBC query, but for security purposes I have attached a dummy Excel file here.
Please help me find a solution.
Thanks in advance.
Yes, I agree with you that Excel is not designed to be a database, but I don't think the root cause of the problem here is the use of Excel as the output. For me, the root cause is trying to use the Reporting tools with this volume of rows.
I can write 400,000 rows by 36 columns of data (without formatting) in ~17 seconds here.
| Rows × columns | Workflow time | File size | Opening time |
|---|---|---|---|
| 400k × 36 | ~17 s | ~40 MB | ~20 s |
| 200k × 36 | ~9 s | ~21 MB | ~9 s |
Hi Felipe,
Thanks for your suggestion. I tried using the Output tool and it worked. But I need to connect the Output tool with an Action tool, as I need to transfer the file dynamically to a certain folder. Is that possible?
Please share the workflow.
Regards,
Jayan
1) If you want to save the output of your workflow into a folder that already exists, you can use the Formula tool + Output tool with the proper configuration. Here you can see exactly how (with workflows): Solved: Re: Output multiple files with multiple sheets bas... - Alteryx Community
2) If you need to move files that already exist, or create new folders to put files inside, it's a little more tricky (there is a short sketch of the idea below). Here is how (with workflows): Solved: Re: Save a file to a dynamic Folder - Alteryx Community
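If you end up scripting the second case instead of building it with macros, the underlying logic is simple enough to sketch in a few lines of Python (for example from the Python tool or an after-run command). This is only a rough illustration, and the paths below are placeholders, not anything from your workflow:

```python
import shutil
from pathlib import Path

# Placeholder paths - point these at your real output file and target folder.
source_file = Path(r"C:\Temp\workflow_output.xlsx")
target_folder = Path(r"C:\Reports\Finance\2024")

# Create the destination folder if it does not exist yet,
# then move the freshly written file into it.
target_folder.mkdir(parents=True, exist_ok=True)
shutil.move(str(source_file), str(target_folder / source_file.name))
```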
Hi All,
I have applied the workflow with the Output tool, but I still need a way to format the data in Excel. The end user will not feel comfortable if the file is not formatted properly. I find this is the only limitation of Alteryx.
Hi,
I am actually looking for a workflow that will help me read and write about 200K rows in a specific Excel format.
Regards,
Hi @Jayan_alteryx - you have another requirement too, no? That your file not be 50 MB? I think part of the issue here is that this last request makes the task borderline impossible.
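If some basic formatting (a styled header row, frozen panes, column widths) would be enough for the end users, one possible workaround is to let the Output tool write the raw .xlsx quickly and then apply the formatting afterwards, for example with a small openpyxl script run from the Python tool or an after-run event. This is only a sketch under those assumptions - the path is a placeholder, loading a 200K-row file this way still takes time and memory so it would need testing at your volume, and it does nothing for the file-size concern:

```python
from openpyxl import load_workbook
from openpyxl.styles import Font, PatternFill
from openpyxl.utils import get_column_letter

path = r"C:\Temp\workflow_output.xlsx"  # placeholder path to the file the Output tool wrote

wb = load_workbook(path)
ws = wb.active

# Bold white-on-blue header row, with the pane frozen just below it.
header_fill = PatternFill("solid", fgColor="4472C4")
for cell in ws[1]:
    cell.font = Font(bold=True, color="FFFFFF")
    cell.fill = header_fill
ws.freeze_panes = "A2"

# Fixed column widths (openpyxl has no true auto-fit).
for col in range(1, ws.max_column + 1):
    ws.column_dimensions[get_column_letter(col)].width = 18

wb.save(path)
```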