Error in loading the Target Table using Hash Lookups

Post questions here relating to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

ranga1970
Participant
Posts: 141
Joined: Thu Nov 04, 2004 3:29 pm
Location: Hyderabad

Post by ranga1970 »

I think this is a database issue. A rollback segment problem arises when you try to commit large volumes of data, or when an abort occurs and you try to roll back the data. Contact your DBA, and try to reduce your commit level.
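
To illustrate why a lower commit level helps, here is a minimal PL/SQL sketch of batched commits, so that no single transaction has to fit its entire before-image into the rollback segment. The table names, columns and the batch size of 1000 are placeholders, not anything from the actual job:

    -- Illustrative sketch only: commit in small batches so each
    -- transaction needs only a modest amount of rollback space.
    -- stage_table, target_table, their columns and the batch size
    -- are all hypothetical.
    DECLARE
        v_rows PLS_INTEGER := 0;
    BEGIN
        FOR rec IN (SELECT id, name FROM stage_table) LOOP
            INSERT INTO target_table (id, name) VALUES (rec.id, rec.name);
            v_rows := v_rows + 1;
            IF MOD(v_rows, 1000) = 0 THEN
                COMMIT;  -- end the transaction every 1000 rows
            END IF;
        END LOOP;
        COMMIT;  -- commit the final partial batch
    END;
    /

In a Server job the equivalent knob is the rows-per-transaction (transaction size) setting on the Oracle stage.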

thanks
RRCHINTALA
pavithra_12
Participant
Posts: 13
Joined: Thu Mar 17, 2005 1:20 am

Post by pavithra_12 »

Hi,

We have also reduced the commit level, but the problem still persists. Is there any way to design the server job to overcome this error?

Regards
Pavithra
ranga1970
Participant
Posts: 141
Joined: Thu Nov 04, 2004 3:29 pm
Location: Hyderabad

Post by ranga1970 »

What's the commit level?
How about asking your DBA to increase the RBS (rollback segment)?
What's the record size?
RRCHINTALA
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

This is NOT a DataStage problem. Nor does it have anything to do with the fact that you're performing lookups to hashed files.

You are trying to send too large a transaction to Oracle.

Possible solutions are to have your Oracle DBA enlarge the rollback segment so that it can handle the size of transaction you are sending, or to send smaller transactions (fewer rows per transaction).

The rollback segment keeps the "before image" records, in case you elect to roll back the transaction. Discuss with your Oracle DBA.
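
For reference, a minimal sketch of what the DBA-side fix can look like under manual undo management; the segment name, tablespace and storage sizes below are placeholders, not recommendations:

    -- Hypothetical example: create a larger rollback segment and
    -- bring it online. All names and sizes are placeholders for
    -- the DBA to decide.
    CREATE ROLLBACK SEGMENT rbs_large
        TABLESPACE rbs_ts
        STORAGE (INITIAL 50M NEXT 50M MINEXTENTS 2 MAXEXTENTS UNLIMITED);

    ALTER ROLLBACK SEGMENT rbs_large ONLINE;

    -- A loading session can then request it explicitly at the
    -- start of each transaction:
    SET TRANSACTION USE ROLLBACK SEGMENT rbs_large;

The alternative, as noted above, is simply to commit more often so that each transaction's before-image fits in the existing segment.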
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
pavithra_12
Participant
Posts: 13
Joined: Thu Mar 17, 2005 1:20 am

Post by pavithra_12 »

Hi,

Thanks for the valuable suggestion. Please let me know whether including Link Partitioner and Link Collector stages can overcome this error, or whether there is any other way to design a Server job that splits the huge volume of data, processes it, and loads it into the data mart.

Regards
Pavithra