how to upsert an oracle table with huge amount of data?

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


donhoff
Participant
Posts: 8
Joined: Fri Sep 07, 2007 9:58 pm


Post by donhoff »

Hi, I am upserting an Oracle table, but the data volume is very large.

The incremental data is 100 million records; 50% are updates and 50% are inserts.

So far I have bulk loaded the data into a temporary table and then run a MERGE statement, but the MERGE takes about 7 hours.
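For context, the temp-table-plus-MERGE approach described above usually looks something like this (all table and column names here are made-up examples, not the poster's actual schema):

```sql
-- Upsert from the bulk-loaded staging table into the target.
-- stage_tmp, target_tbl, id, col1, col2 are hypothetical names.
MERGE INTO target_tbl t
USING stage_tmp s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.col1 = s.col1,
             t.col2 = s.col2
WHEN NOT MATCHED THEN
  INSERT (id, col1, col2)
  VALUES (s.id, s.col1, s.col2);
```

With a 50/50 insert/update mix at this volume, most of the time typically goes into index maintenance and the join against the target, which is why the replies below suggest restructuring the work rather than tuning this one statement.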

Is there any way I could upsert the data faster?

Thanks!
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

That's about 4000 rows/second which, if you believe this is a valid metric, isn't all that bad.

You could try separating the operations. Use direct write (bulk load) for the inserts, then upsert (with "Update Only" as the rule) for the updates.
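The split Ray describes could be sketched in plain SQL (assuming the staging table from the original post; every name here is hypothetical — in a DataStage job this would be the Oracle stage's bulk Load write method followed by an Upsert with the "Update Only" rule):

```sql
-- 1) Inserts only: rows whose key is not yet in the target,
--    written direct-path so no conventional redo/undo overhead.
INSERT /*+ APPEND */ INTO target_tbl (id, col1, col2)
SELECT s.id, s.col1, s.col2
FROM   stage_tmp s
WHERE  NOT EXISTS (SELECT 1 FROM target_tbl t WHERE t.id = s.id);
COMMIT;

-- 2) Updates only: rows whose key already exists ("Update Only" rule).
UPDATE target_tbl t
SET    (t.col1, t.col2) = (SELECT s.col1, s.col2
                           FROM   stage_tmp s
                           WHERE  s.id = t.id)
WHERE  EXISTS (SELECT 1 FROM stage_tmp s WHERE s.id = t.id);
COMMIT;
```

The point of the split is that the 50 million pure inserts can go in via a direct-path load instead of being funneled through the row-by-row matching logic of the upsert.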
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kwwilliams
Participant
Posts: 437
Joined: Fri Oct 21, 2005 10:00 pm

Post by kwwilliams »

If you go to asktom.oracle.com, Mr. Kyte would tell you that instead of performing an upsert, you should drop and rebuild the table using the loader. 100-million-row operations are going to take some time to finish, no matter which tool performs them.
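One hedged sketch of that rebuild idea: combine the old table and the incremental data into a fresh table in a single pass, then swap the tables. All names are hypothetical, and the approach actually recommended on asktom may differ in detail:

```sql
-- Build the replacement table in one pass; where a key exists in the
-- staging data, the staged (newer) row wins over the existing one.
CREATE TABLE target_new NOLOGGING PARALLEL AS
SELECT NVL(s.id, t.id) AS id,
       CASE WHEN s.id IS NOT NULL THEN s.col1 ELSE t.col1 END AS col1,
       CASE WHEN s.id IS NOT NULL THEN s.col2 ELSE t.col2 END AS col2
FROM   target_tbl t
FULL OUTER JOIN stage_tmp s ON t.id = s.id;

-- Then: drop target_tbl, rename target_new to target_tbl,
-- and recreate indexes, constraints, and grants.
```

This trades the per-row upsert cost for one set-based scan plus an index rebuild, which at 100 million rows is often the cheaper path.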