What is the best way to load Huge Volume of History data?

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

srini_ramesh
Participant
Posts: 13
Joined: Fri Oct 08, 2004 6:19 am

What is the best way to load Huge Volume of History data?

Post by srini_ramesh »

Hi,
What is the best method to load a very large volume of data into Oracle tables using a parallel job in DataStage? Which stages would be most helpful for fast loading?
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
you need to check which is faster in your environment.
Your options are:
1. Prepare a bulk load data file and a ctl file, then run the Oracle bulk loader (SQL*Loader) from the command line (a minimal sketch follows below).
2. If your DB/table supports parallel loading, you can use the DS plugins to load the data in parallel.
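For option 1, here is a rough sketch of what the control file could look like; the file names, table name, columns, and delimiter are just placeholders for illustration, not from your job:

Code:

-- history_load.ctl (hypothetical example): append pipe-delimited rows to a history table
LOAD DATA
INFILE 'history_data.dat'
APPEND
INTO TABLE history_fact
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
  order_id,
  customer_id,
  order_date  DATE "YYYY-MM-DD",
  amount
)

You would then invoke it from the command line with something like sqlldr userid=scott/tiger control=history_load.ctl log=history_load.log direct=true. The direct=true option uses the direct path load, which is usually much faster for big history loads, but check that it is acceptable for your table (it bypasses some constraints and triggers during the load).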

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org