Hi,
How can I automatically read a large number of CSV files (over 1,000) with completely different columns and separators, union them, and then load the result into a table?
@Saraabdi955,
Please see below
1- Build a batch macro.
2- Use an Input Directory tool to read all the files.
The workflow is attached.
Hope this helps,
Regards,
Please see below:
You have to click on the reversed question mark and then map the full path on the left side.
If this solves your issue please mark the answer as correct if you don't mind 🙂
Best Regards,
Solution 1:
Group the files that share a separator into the same folder, then change the delimiter in the macro:
Then run the workflow and store the output in a temp file.
After that, union all the temp outputs.
But it's really a manual process...
Solution 2:
Another way is to add the delimiter for each file:
Then join it with the Input Directory output:
After that, batch in the file and the delimiter.
Don't forget to map the second control parameter in the main workflow:
Attached the workflow
Hope that helps,
Have you selected the option below in the macro?
I have tested it with 3 files with different schemas, and it works for me.
Hope that helps!
Thank you very much for replying to my question,
but I get this error after running the attached workflow and I don't know what the problem is:
Error: read (4): The Control Parameter "Control Parameter (4)" must be mapped to a field.
Excuse me, I have another question.
If the delimiters in the files are different, what can I do to read all of them correctly?
For example, some files split using ':', some split using Tab, etc.