Insert issue with large volume of data

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

VijayDS
Participant
Posts: 38
Joined: Thu Jun 18, 2009 3:50 am

Insert issue with large volume of data

Post by VijayDS »

Hi All,

I have an insert job that has to load more than 300,000 records, but it loads about 220,000 records and then aborts with the error message below.

In the Oracle Enterprise stage I selected the User-defined update write method and specified an insert statement.

ORA_TEReport_Insert,0: Unable to use a record for update.

After DataStage had loaded the 220,000 records, I inserted one record manually, and the manual insertion went through without any issue.

I also verified the source data but didn't find any issue.

Is this a space issue with the table, an issue with the data, or is the job aborting for some other reason?

If you have faced this issue before, or know what it might be, please suggest where the problem is.


Thanks
Vijay
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

First suggestion is always going to be to search and since you didn't mention doing that - see if an Exact Match search for "Unable to use a record for update" helps at all.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kogads
Premium Member
Posts: 74
Joined: Fri Jun 05, 2009 5:36 pm

Post by kogads »

Check the update statement. Also, see whether you are getting any nulls in the key fields.
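One quick way to act on that suggestion is to scan the source extract for rows whose key field is null or empty before the load runs. This is only a sketch: the key column name (`ORDER_ID`) and the inline sample data are placeholders for illustration, not taken from the original job.

```python
import csv
import io

def find_null_keys(lines, key_field):
    """Return (line_number, row) pairs where key_field is missing or empty.

    Rows with a null/empty key are a common cause of an Oracle upsert
    failing partway through a large load.
    """
    reader = csv.DictReader(lines)
    bad = []
    for i, row in enumerate(reader, start=2):  # start=2: line 1 is the header
        value = row.get(key_field)
        if value is None or value.strip() == "":
            bad.append((i, row))
    return bad

# Example with inline data; in practice pass an open file handle instead.
sample = io.StringIO("ORDER_ID,AMOUNT\n101,5.00\n,7.25\n103,1.10\n")
print(find_null_keys(sample, "ORDER_ID"))
```

Running this against the real extract (with the actual key column substituted) would show exactly which record, if any, is arriving with a null key around the point where the job aborts.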