Hi,
I am taking input from a common share where I have around 1,000 .txt files. Some files have 100,000 line records, some have 15,000, and some have 0, and so on. All the inputs share a common start to their names, so I used filename_*.txt (screenshot attached).
But after reading around 200 to 250 files, I get a "Too many records" error. Even reading those 200 files takes around a day.
Is there any way to save the output file by file in .yxdb format, or to use some kind of loop, to fix this?
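Outside of Alteryx, the idea being asked about here (loop over the matching files and handle each one on its own, rather than reading all 1,000 at once) can be sketched in Python. This is only an illustration of the pattern, not the Alteryx solution; the file names and the "count the records" step are made up for the demo.

```python
import glob
import os
import tempfile

def process_in_batches(pattern):
    """Read files matching the pattern one at a time instead of all at
    once, so one oversized file cannot blow up the whole run."""
    counts = {}
    for path in sorted(glob.glob(pattern)):
        with open(path, encoding="utf-8") as f:
            # Here we just count records per file; a real run would
            # write each file's cleaned records to its own output.
            counts[os.path.basename(path)] = sum(1 for _ in f)
    return counts

# Tiny demo with throwaway files (names are invented for illustration).
tmp = tempfile.mkdtemp()
for name, lines in [("Test1_a.txt", 3), ("Test1_b.txt", 0)]:
    with open(os.path.join(tmp, name), "w", encoding="utf-8") as f:
        f.write("rec\n" * lines)

print(process_in_batches(os.path.join(tmp, "Test1_*.txt")))
# {'Test1_a.txt': 3, 'Test1_b.txt': 0}
```

A batch macro in Alteryx does essentially the same thing: it runs the inner workflow once per file path instead of pulling every file into a single input.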
Hi @Karthick461 ,
You might want to read the following discussion :-), especially @estherb47's suggestion of using a batch macro.
It might just work, as it would batch-process all the files instead of reading them all at once :-).
If you drop in some dummy .txt files, I can build it for you if that helps.
Greetings,
Seb
Hi @Sebastiaandb ,
I've attached a couple of test files; below is a screenshot of the WF. I'm taking the input as Test1_*.txt, then using a Select tool to replace the field names and a Multi-Field Formula to remove the # characters in the txt files.
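For anyone following along outside Alteryx, the two cleanup steps described above (replace the field names, then strip # from every value) could be sketched like this in Python. The sample header and rows are invented; they are not from the actual Test1_*.txt files.

```python
def clean_rows(rows, new_names):
    """Replace the header with new field names and remove '#' from
    every field value, mirroring the Select + Multi-Field Formula steps."""
    header, *records = rows
    cleaned = [new_names]  # swap in the replacement field names
    for rec in records:
        cleaned.append([field.replace("#", "") for field in rec])
    return cleaned

rows = [["old1", "old2"], ["#A1", "B#2"], ["C3", "#D4#"]]
print(clean_rows(rows, ["Field1", "Field2"]))
# [['Field1', 'Field2'], ['A1', 'B2'], ['C3', 'D4']]
```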
Hi @Karthick461 ,
I've made a workflow for you with a batch macro in it. Just make sure to select the right directory in the Directory tool and you'll be fine. Let me know if this pulls in all of your records, I'm curious myself haha!
Greetings,
Seb
Thank you so much for your support on this.
I have changed the batch input file as below. Is that fine? Once the macro is saved, I can change the directory as well, right?
Hi @Karthick461 ,
Sure you can. Just make sure you insert the altered macro into your flow (and not the one with my paths in it haha ;-)).
It should work once you've also changed the path in the Directory tool in the workflow (not in the macro).
High five!
Greetings,
Seb
Thanks so much, Seb. That was really helpful.
I have started my WF. I'll let you know once it's done.
Thanks again