@Sashankbongu are you reading multiple files via the Dynamic Input? Do they all follow the same template, i.e. the same columns?
No @RishiK, it's just one input file that connects to the Dynamic Input (Netezza).
I've added a better screenshot.
@Sashankbongu thanks for that.
You can turn on Performance Profiling on your workflow to get a better understanding of the bottlenecks.
In the Configuration pane, select the Runtime tab and tick the option Enable Performance Profiling.
You can then see what the expensive blocks are in your process.
I'd also advise that you try the In-DB tools to connect to Netezza rather than the Dynamic Input - it looks like the performance issues are on the Netezza side. With In-DB, all of the processing overhead stays on the database server.
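To make the In-DB idea concrete, here's a minimal Python sketch of the principle: instead of pulling every row out of Netezza and filtering it client-side (the Dynamic Input pattern), you push the filter into a single SQL statement so the database does the work. The table and column names (`sales`, `id`) are hypothetical, purely for illustration - this is not Alteryx's internal behavior, just the general idea.

```python
def client_side_filter(rows, wanted_ids):
    """Simulates the client-side pattern: fetch everything, filter locally.

    Every row crosses the network before being filtered - the bottleneck
    seen in the workflow above.
    """
    return [r for r in rows if r["id"] in wanted_ids]


def build_indb_query(table, id_column, wanted_ids):
    """Builds one query so the filtering happens on the server instead.

    Only matching rows ever leave the database.
    """
    ids = ", ".join(str(i) for i in sorted(wanted_ids))
    return f"SELECT * FROM {table} WHERE {id_column} IN ({ids})"


if __name__ == "__main__":
    print(build_indb_query("sales", "id", {3, 1, 2}))
    # SELECT * FROM sales WHERE id IN (1, 2, 3)
```

The point is where the work happens: with the query-side approach, the server's query planner and indexes do the filtering, and far fewer rows travel over the wire.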
Thanks @RishiK,
you are right, the processing issues are on the Netezza side - it took 99.9% of the run time.
I will try using In-DB tools for the input.
Have not used these before.
Thanks,
Sashank
@Sashankbongu I thought that might be the case. If things improve feel free to mark my suggestion above as a solution.
Hi @RishiK, I later found that I can increase the "IN" clause limit (under the WHERE clause options) from the default of 1000, so more records are queried at once.
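For anyone else landing here, a rough Python sketch of why that setting helps: the tool splits the incoming key values into batches and issues one query per batch, with each batch becoming one `IN (...)` clause. Raising the limit above the default 1000 means fewer, larger queries and fewer round trips to Netezza. This is an illustration of the batching idea only, with made-up names, not Alteryx's actual implementation.

```python
def batched_in_clauses(ids, limit=1000):
    """Split ids into batches of `limit` and build one IN (...) clause each.

    Each returned clause corresponds to one query sent to the database.
    """
    batches = [ids[i:i + limit] for i in range(0, len(ids), limit)]
    return [f"id IN ({', '.join(map(str, b))})" for b in batches]


ids = list(range(1, 2501))                          # 2500 key values
print(len(batched_in_clauses(ids, limit=1000)))     # 3 queries at the default
print(len(batched_in_clauses(ids, limit=5000)))     # 1 query with a higher limit
```

Fewer batches means less per-query overhead, which matches the speedup observed after raising the limit.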