Hi
How can I automatically read a large number of CSV files (over 1,000) that have completely different columns and separators, union them, and then load them into a table?
Please see below:
1- Build a batch macro.
2- Use the input directory to read all the files.
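(If you ever want to do the same thing outside Alteryx, here is a minimal Python sketch of the idea. The pandas library, the folder path, and the assumption that every file is comma-delimited at this stage are all mine, not part of the attached workflow.)

```python
# Minimal sketch: read every CSV in one folder and union the results.
# pd.concat aligns on column names, so files with different columns
# are stacked with NaN filling the missing cells.
import glob
import pandas as pd

files = glob.glob(r"C:\data\csv_files\*.csv")  # hypothetical input directory

frames = [pd.read_csv(path) for path in files]
combined = pd.concat(frames, ignore_index=True, sort=False)
combined.to_csv("combined_output.csv", index=False)
```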
The workflow is attached.
Hope this helps,
Regards,
Thank you very much for replying to my question,
but I get this error after running the attached workflow and I don't know what the problem is:
Error: read (4): The Control Parameter "Control Parameter (4)" must be mapped to a field.
Please see below:
You have to click on the upside-down question mark and then map the full path on the left side.
If this solves your issue, please mark the answer as correct if you don't mind 🙂
Best Regards,
Exactly correct...
Thanks a lot.
Regards,
Excuse me, I have another question.
If the delimiters in the files are different, what can I do to read all of them correctly?
For example, some files are split using ':', some files are split using Tab, etc.
Solution 1:
Group files with the same separator into the same folder, then change the delimiter in the macro.
Then run the workflow and store the output in a temp file.
After that, union all the temp outputs.
But it's really a manual process...
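(As a rough scripted equivalent of Solution 1, here is a Python sketch; the folder names and delimiters below are only examples, not anything from the macro itself.)

```python
# Solution 1 sketch: one folder per separator, read each folder with its
# own delimiter, then union everything at the end.
import glob
import pandas as pd

folders = {                  # hypothetical folder -> delimiter mapping
    r"C:\data\comma": ",",
    r"C:\data\tab":   "\t",
    r"C:\data\colon": ":",
}

frames = []
for folder, sep in folders.items():
    for path in glob.glob(folder + r"\*.csv"):
        frames.append(pd.read_csv(path, sep=sep))

combined = pd.concat(frames, ignore_index=True, sort=False)
combined.to_csv("combined_output.csv", index=False)
```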
Solution 2:
Another way is to add the delimiter for each file:
Then do a join with the input directory:
After that, batch in the file and the delimiter.
Don't forget to map the second parameter in the main workflow. A scripted sketch of the same idea follows below:
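(Outside Alteryx, Solution 2 amounts to a lookup of file name to delimiter joined against the directory listing. The file names and delimiters below are made up for illustration.)

```python
# Solution 2 sketch: each file is read with the delimiter assigned to it
# in a lookup table; unlisted files fall back to a comma.
import glob
import os
import pandas as pd

delimiters = {               # hypothetical file -> delimiter list
    "sales_2020.csv": ",",
    "inventory.csv":  "\t",
    "customers.csv":  ":",
}

frames = []
for path in glob.glob(r"C:\data\csv_files\*.csv"):
    sep = delimiters.get(os.path.basename(path), ",")
    frames.append(pd.read_csv(path, sep=sep))

combined = pd.concat(frames, ignore_index=True, sort=False)
```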
The workflow is attached.
Hope that helps,
Regards,
Thanks a lot,
Regards,
Excuse me, in Solution 2 do we need to enter every file's name and its delimiter in the list?
That's very hard because there are over 1,000 files...
You may be interested in giving this macro a try: https://community.alteryx.com/t5/Engine-Works/The-Ultimate-Alteryx-Holiday-gift-of-2015-Read-All-Exc...
It has been built to accept many different file types and automatically union them.
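(If maintaining a delimiter list for more than 1,000 files is impractical, a scripted alternative is to detect each file's delimiter automatically. This is only a conceptual Python sketch, not what the linked macro does internally; the folder path and delimiter candidates are assumptions.)

```python
# Auto-detect each file's delimiter from a small sample, then read and union.
import csv
import glob
import pandas as pd

frames = []
for path in glob.glob(r"C:\data\csv_files\*.csv"):   # hypothetical folder
    with open(path, newline="") as f:
        sample = f.read(4096)
    dialect = csv.Sniffer().sniff(sample, delimiters=",;\t:|")
    frames.append(pd.read_csv(path, sep=dialect.delimiter))

combined = pd.concat(frames, ignore_index=True, sort=False)
combined.to_csv("combined_output.csv", index=False)
```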