
Loading a Huge Volume of Data into an Oracle Table

Posted: Tue May 29, 2012 2:17 pm
by goriparthi
Hi all,

We have a job that does an upsert. The job design is to join against the XRF table to get the surrogate key and then load into an Oracle table.

The input data, which arrives through a set of flat files, is about 180 million records (around 20 GB), and the retention capacity of the target table is around 1.5 billion records. Once the table grows that big, how should we handle splitting the inserts and updates?

I would like to know whether anyone has dealt with similar volumes and split the inserts from the updates.

Any suggestions or approaches are appreciated.

Thanks

Posted: Wed May 30, 2012 12:50 am
by ray.wurlod
Do a lookup against the primary key of the target table.

If that's too big, first extract a single-column table made up of those primary keys and do a lookup against that.

But it shouldn't be necessary to do that, because the lookup against the primary key should be resolved in its index.
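
For what it's worth, a minimal sketch of that split outside of DataStage (in Python) might look like the following. The table name MY_TARGET, the key column SURR_KEY, the cx_Oracle connection string, and the sample input rows are all assumptions for illustration only; in the real job the lookup would be done against the extracted key set inside the ETL flow.

Code:

    # Rough sketch: extract the target's primary keys, then route each
    # incoming row to an insert stream or an update stream.
    # Table, column, and connection details below are placeholders.
    import cx_Oracle

    conn = cx_Oracle.connect("etl_user/password@dbhost/orclpdb")  # placeholder DSN
    cur = conn.cursor()

    # 1. Extract only the primary-key column of the target table.
    #    With ~1.5 billion keys this set may not fit in memory; in that case
    #    land the keys in a dataset/hash file and let a lookup stage use it.
    cur.execute("SELECT surr_key FROM my_target")
    existing_keys = {row[0] for row in cur}

    # Illustrative input; in the real job these rows come from the flat files
    # after the XRF join has attached the surrogate key.
    incoming_records = [
        {"surr_key": 101, "amount": 10.5},
        {"surr_key": 999, "amount": 3.2},
    ]

    # 2. Route each record: key already in the target -> update, else insert.
    inserts, updates = [], []
    for rec in incoming_records:
        (updates if rec["surr_key"] in existing_keys else inserts).append(rec)

    # 3. Each stream can then be bulk-loaded with the statement suited to it
    #    (array INSERT for new keys, array UPDATE for existing ones).

Splitting the streams this way lets each path use the fastest load method available to it, rather than forcing every row through a generic upsert.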