I have processed some data and want to insert it into an Oracle table.
When I run the same workflow to write the data to a CSV file it takes 30-40 seconds, but writing to the Oracle table takes far longer:
No. of records    Time to write to Oracle table
300               3:30 min
1,000             7:50 min
10,000            50 min
Properties used for the workflow:
1) Append Existing
2) "Output Data" tool to write the data into the table.
I have also tried Bulk Load, but it takes approximately the same time.
How can I minimise this time, and is it possible to write 100k records into an Oracle table within 5 minutes?
Hi @Sudharshan_Sannathi ,
One thing you can try: "If you're using an Oracle ODBC, then try adjusting your Fetch Buffer Size in your ODBC manager. This basically changes the transaction size, so making fewer, larger transactions may be beneficial." This suggestion was taken from @CharlieS's response to the following post.
See if that works for you.
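To make the "fewer, larger transactions" idea concrete, here is a minimal sketch of batching rows before insert. The chunking logic is plain Python and runs as-is; the commented `cursor.executemany` call shows how each batch would be applied with a driver such as python-oracledb, and the table/column names are purely illustrative assumptions.

```python
def chunked(rows, batch_size):
    """Yield successive batches of at most batch_size rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# Hypothetical sample data standing in for the workflow's records.
rows = [(i, f"name_{i}") for i in range(10)]
batches = list(chunked(rows, 4))

# With a real connection you would do, per batch (one commit per batch,
# not per row -- that is what shrinks the transaction count):
#   cursor.executemany("INSERT INTO my_table (id, name) VALUES (:1, :2)", batch)
#   connection.commit()

print([len(b) for b in batches])  # -> [4, 4, 2]
```

Tuning the batch size is the code-level analogue of raising the ODBC Fetch Buffer Size: larger batches mean fewer round trips and fewer commits.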
Best,
Fernando Vizcaino
Hi Fernando,
Thanks for your response; I just found the solution.
Data types were the main issue.
I changed the data types in the workflow to match the Oracle DB's columns, and the problem is solved. It now takes 17 seconds to load 100k records and 2 minutes to load 1 million.
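For readers asking what "matching the data types" might look like: below is a hedged sketch of a lookup from common Alteryx field types to Oracle column types. The Oracle type names are real, but which workflow type maps to which Oracle type (and the VARCHAR2 sizes) are assumptions for illustration; verify against your own table's definition, since a mismatch can force slow per-row conversions.

```python
# Assumed mapping from workflow field types to Oracle column types.
# Sizes like VARCHAR2(255) are placeholders -- size to your actual data.
TYPE_MAP = {
    "Int32":     "NUMBER(10)",
    "Int64":     "NUMBER(19)",
    "Double":    "BINARY_DOUBLE",
    "String":    "VARCHAR2(255)",
    "V_WString": "NVARCHAR2(255)",
    "Date":      "DATE",
    "DateTime":  "TIMESTAMP",
}

def oracle_type(workflow_type):
    """Return the suggested Oracle type, or raise for an unmapped type."""
    try:
        return TYPE_MAP[workflow_type]
    except KeyError:
        raise ValueError(f"no Oracle mapping for {workflow_type!r}")

print(oracle_type("Double"))  # -> BINARY_DOUBLE
```

The point is not this exact table but the practice: set each field's type in the workflow to the type the Oracle column already uses, rather than letting everything default to wide strings.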
Could you please explain more about which data types are suitable for Oracle? I have the same problem as you. I'm looking forward to hearing from you 🙂
I am getting the same issue: 1 hour to write 100k rows. I tried the Oracle Bulk Loader, but it does not help.
Can you provide information on the data types? We are having a problem writing to an Oracle table, and this may be the issue. What types are "good"?
^ Need working data types 👀