Data caching - Committing data batch-wise

Post questions here related to DataStage Enterprise/PX Edition for such areas as parallel job design, parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

dsdeveloper123
Participant
Posts: 33
Joined: Sun Jun 24, 2007 9:46 pm

Data caching - Committing data batch-wise

Post by dsdeveloper123 »

Hi,

I have to insert close to 10 lakh (1,000,000) records into an Oracle database. DataStage currently caches all 10 lakh records, inserts them, and then performs a single commit at the end.

What I want instead is for the insert to run in batches of 1 lakh (100,000) records, with a COMMIT performed after each batch.


Kindly let me know if there is any solution, as soon as possible.

Cheers!
Mayur Dongaonkar
Participant
Posts: 20
Joined: Mon Dec 11, 2006 10:57 am
Location: Pune

Re: Data caching - Committing data batch-wise

Post by Mayur Dongaonkar »

You have to set the array size and commit size options to 1 lakh (100,000) each.
Mayur Dongaonkar.
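The batched-commit behavior those stage settings control can be illustrated in plain code. This is only a sketch of the idea, not DataStage internals: `insert_with_batch_commits`, `execute_batch`, and `commit` are hypothetical names standing in for the stage's array insert and transaction handling.

```python
def insert_with_batch_commits(rows, execute_batch, commit, commit_size=100_000):
    """Insert `rows`, issuing a COMMIT after every `commit_size` rows
    instead of one commit at the very end.

    Returns (rows_inserted, commits_performed).
    """
    batch = []
    inserted = 0
    commits = 0
    for row in rows:
        batch.append(row)
        if len(batch) == commit_size:
            execute_batch(batch)   # e.g. an array INSERT of commit_size rows
            inserted += len(batch)
            commit()               # transaction boundary after each batch
            commits += 1
            batch = []
    if batch:                      # flush the final partial batch
        execute_batch(batch)
        inserted += len(batch)
        commit()
        commits += 1
    return inserted, commits
```

With 10 lakh input rows and a commit size of 1 lakh, this performs exactly 10 commits rather than one large commit at the end, which is the behavior asked for above.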
dsdeveloper123
Participant
Posts: 33
Joined: Sun Jun 24, 2007 9:46 pm

Post by dsdeveloper123 »

The array size and commit size options are found only in the Oracle OCI stage (i.e., server jobs).

Can you please guide me on how to handle the same with server jobs?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Make up your mind. You posted in the parallel forum with a job type marked as parallel. Now, you ask for a server answer. To get that, post your question in the server forum.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.