I think this is a database issue. Rollback segment problems arise when you try to commit a large volume of data in one transaction, or when an abort occurs and Oracle tries to roll the data back. Contact your DBA, and try reducing your commit level.
thanks
Error in loading the Target Table using Hash Lookups
This is NOT a DataStage problem. Nor does it have anything to do with the fact that you're performing lookups to hashed files.
You are trying to send too large a transaction to Oracle.
Possible solutions are to have your Oracle DBA enlarge the rollback segment so that it can handle the size of transaction you are sending, or to send smaller transactions (fewer rows per transaction).
The rollback segment keeps the "before image" records, in case you elect to roll back the transaction. Discuss with your Oracle DBA.
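In practice, "smaller transactions" means committing every N rows instead of once at the end of the load (in a DataStage Server job this is typically the transaction size / rows-per-commit setting on the target stage). A minimal sketch of the pattern, using SQLite from the Python standard library purely for illustration (the batch size of 1000 is an assumed tuning knob, not a recommendation; the same loop shape applies to an Oracle client):

```python
import sqlite3

COMMIT_EVERY = 1000  # assumed rows per transaction; tune with your DBA

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")

rows = [(i, f"row-{i}") for i in range(5000)]

for i, row in enumerate(rows, start=1):
    conn.execute("INSERT INTO target VALUES (?, ?)", row)
    if i % COMMIT_EVERY == 0:
        conn.commit()  # bound the undo the server must hold per transaction
conn.commit()          # flush any remaining partial batch

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 5000
```

The trade-off: smaller commit intervals keep each rollback segment small, but a mid-load failure leaves already-committed batches in the target, so the job must be restartable.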
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi,
Thanks for the valuable suggestion. Please let me know whether this error can be overcome by including Link Partitioner and Link Collector stages,
or whether there is another way to design a Server Job that splits the huge volume of data, processes it, and loads it into the data mart.
Regards
Pavithra