Loading a Huge Volume of Data into an Oracle Table
Posted: Tue May 29, 2012 2:17 pm
Hi all,
We have a job that does an upsert. The job design is to join on the XRF table to get the surrogate key and then load into an Oracle table.
The input data, which arrives as a set of flat files, is about 180 million records (around 20 GB), and the retention capacity of the target table is around 1.5 billion records. Once the table grows that big, how should we handle splitting the inserts and updates?
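For reference, the kind of split I have in mind can be sketched in Python. This is only a minimal illustration of the idea, not our actual job: the field name `skey` and the in-memory key set are placeholders for the real surrogate key and the XRF/target lookup.

```python
# Minimal sketch: partition incoming records into updates vs. inserts
# by checking each record's surrogate key against the keys already
# present in the target table (simulated here with an in-memory set).

def split_upsert(records, existing_keys, key_field="skey"):
    """Return (updates, inserts): records whose key already exists
    go to updates, the rest go to inserts."""
    updates, inserts = [], []
    for rec in records:
        if rec[key_field] in existing_keys:
            updates.append(rec)
        else:
            inserts.append(rec)
    return updates, inserts

# Example: keys 10 and 20 already exist in the target table
existing = {10, 20}
incoming = [{"skey": 10, "val": "a"},
            {"skey": 30, "val": "b"},
            {"skey": 20, "val": "c"}]
updates, inserts = split_upsert(incoming, existing)
# updates -> rows with skey 10 and 20; inserts -> row with skey 30
```

At our volumes the "existing keys" side obviously cannot be an in-memory set, which is exactly why I am asking how others have handled the split once the target holds over a billion rows.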
I would like to know if anyone has dealt with similar volumes and split the inserts from the updates.
Any suggestions or approaches are appreciated.
Thanks