Best practice to handle a billion records
Posted: Mon Dec 15, 2008 5:11 pm
Hi all,
I am being asked about DataStage's loading capability.
I have a billion records in a file and need to insert into, update, and delete from a DB2 table. On top of that, it needs to be fast. Can you suggest how to make this as quick as possible with DataStage?
I initially thought of the bulk loader, but it can only be used for inserts, not for updates or deletes. I will be doing some transformation, but it will be minimal; in most cases I just need to feed the file to the database for inserts/updates/deletes.
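One way to still get the bulk loader's speed for the insert portion is to split the feed file by operation up front, so inserts can go through the bulk loader while updates and deletes take an ordinary SQL path. A minimal sketch, assuming a hypothetical layout where the first field of each record is an action flag (I/U/D) — the column position, flag values, and file names are all assumptions, not anything from the actual feed:

```python
import csv

def split_by_action(in_path, out_paths):
    """Route each record of a delimited feed to the output file
    matching its action flag (assumed to be the first field)."""
    # Open one writer per action flag, e.g. {"I": "inserts.csv", ...}
    files = {flag: open(path, "w", newline="") for flag, path in out_paths.items()}
    try:
        writers = {flag: csv.writer(f) for flag, f in files.items()}
        with open(in_path, newline="") as src:
            for row in csv.reader(src):
                flag = row[0].upper()            # assumed action column
                writers[flag].writerow(row[1:])  # drop the flag itself
    finally:
        for f in files.values():
            f.close()
```

The inserts file can then be handed to the DB2 bulk loader, while the update and delete files are applied with upsert/delete SQL; whether that beats a single row-by-row pass depends on the insert-to-update ratio in the feed.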
Thanks for your help!