Improving performance
Hi
I have a job which loads a table. For the initial load there were around 35 million records.
I loaded them and it took around 9 hours. I have 7 tables like this.
Now I am doing an incremental load for three months and it is taking more time than the initial load.
My option in the target DRS stage is Insert New Rows or Update Existing.
I also have indexes on this table.
Let me know your ideas on how to improve the performance of the incremental load.
Pradeep Kumar
Re: Improving performance
pradkumar wrote: My option in the target DRS is Insert New Rows or Update Existing.

For a start, your problem is right there. Split your inserts and updates. And for millions of records, use the bulk loader.
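For the insert half, the bulk path in Oracle can be SQL*Loader or a direct-path insert. A minimal sketch of the latter, assuming hypothetical staging and target tables STG_ORDERS and TGT_ORDERS:

```sql
-- Direct-path (bulk) insert: the APPEND hint loads blocks above the
-- high-water mark instead of doing conventional row-by-row inserts.
-- STG_ORDERS and TGT_ORDERS are hypothetical names.
INSERT /*+ APPEND */ INTO tgt_orders
SELECT *
FROM   stg_orders;

-- A direct-path insert must be committed before the table
-- can be queried again in the same session.
COMMIT;
```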
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
There were no warnings. I am working on an Oracle database (source and target). There are no referential constraints.
By the way, this job is running in development; it has not yet been moved to production. Hopefully it will take less time in PRD.
Correct me if I am wrong, and let me know if this performance can be improved too.
Incremental ===> 3 months of data
Pradeep Kumar
But not as much as doing UPSERTS.
Alternatively, load your file into a temp table. Do a SQL join to get the inserts; similarly, you can get the updates. Do bulk inserts. For the updates, pass them through the DRS stage or OCI stage and do updates only. It will be much faster.
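A minimal sketch of that split, assuming a hypothetical key column ORDER_ID on staging table STG_ORDERS and target table TGT_ORDERS:

```sql
-- Rows to INSERT: staging rows whose key does not exist in the target.
SELECT s.*
FROM   stg_orders s
LEFT OUTER JOIN tgt_orders t
       ON t.order_id = s.order_id
WHERE  t.order_id IS NULL;

-- Rows to UPDATE: staging rows whose key already exists in the target.
SELECT s.*
FROM   stg_orders s
INNER JOIN tgt_orders t
       ON t.order_id = s.order_id;
```

Feed the first result set to a bulk insert and the second to an update-only link.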
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
DSguru2B wrote: You are correct and hence it's taking a lot of time. Also it forces your array size to be set to 1, and that's another downfall.

So is it better to use only Insert New Rows in the Update Action?
Moreover, down the line we will be doing an incremental load every two days. Even then, will using Insert and Update pose a problem?
Pradeep Kumar
Partitioning the database tables should help in case of large volumes of data. Contact your DBA.
Please read this article: http://www.oracle.com/technology/produc ... oning.html
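For illustration only, a range-partitioning sketch with hypothetical names, partitioning the target by load date so each incremental window touches only the recent partitions:

```sql
-- Hypothetical range-partitioned target, one partition per quarter.
CREATE TABLE tgt_orders_part (
  order_id  NUMBER        NOT NULL,
  load_date DATE          NOT NULL,
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (load_date) (
  PARTITION p_2006_q1 VALUES LESS THAN (TO_DATE('2006-04-01','YYYY-MM-DD')),
  PARTITION p_2006_q2 VALUES LESS THAN (TO_DATE('2006-07-01','YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);
```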
paddu
pradkumar wrote: So is it better to use only Insert New Rows in the Update Action?

Yes, 110% better.

pradkumar wrote: Moreover, down the line we will be doing an incremental load every two days. Even then, will using Insert and Update pose a problem?

Yes.
And if your DBA is anything like mine, he will laugh at the fact that you are asking him/her to partition a database table for 30 million records. For them this is considered an average amount; they consider partitioning for billions of records. But maybe that's just my DBA.
:wink:
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.