Experts
Job Design
Sequential File stage with a schema file and RCP, loading into an Oracle table via the Oracle Connector stage. Record Count: 2000, Array Size: 500. 100 columns are loaded, many of them date fields; the largest varchar is 100 bytes.
Error
Ora_Load_Data,0: The OCI function OCIStmtExecute returned status -1. Error code: 24381, Error message: ORA-24381: error(s) in array DML. (CC_OraStatement::executeInsert, file CC_OraStatement.cpp, line 1818)
Previous posts
viewtopic.php?t=140714&highlight=24381
Understanding
1. We need to modify the Record Count and/or Array Size to resolve this fatal error
2. There are no standard values, because every Oracle database and DataStage configuration is different
Question
Our jobs had been running well in production until yesterday. Today we had a sudden spike in the number of records (from 2000 to 4000) and the job failed. We plan to fix it by changing Record Count and/or Array Size. Is there a way to predict this kind of failure?
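For what it's worth, the arithmetic behind the two knobs is simple: the Connector sends rows to Oracle in batches of Array Size, and commits every Record Count rows (Record Count is normally a multiple of Array Size). A rough sketch of how the spike changes the numbers, using the values from this job (the function name is mine, for illustration only):

```python
def batches_and_commits(total_rows: int, array_size: int, record_count: int):
    """Return (round trips to Oracle, number of commits) for one run."""
    batches = -(-total_rows // array_size)      # ceiling division
    commits = -(-total_rows // record_count)
    return batches, commits

# Yesterday's volume vs. today's spike, with this job's settings (500 / 2000):
print(batches_and_commits(2000, 500, 2000))     # (4, 1)
print(batches_and_commits(4000, 500, 2000))     # (8, 2)
```

Note that doubling the volume only doubles the number of batches; ORA-24381 itself reports row-level errors inside a batch, so a volume spike usually triggers it by bringing in new bad rows rather than by the sizes themselves.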
Thanks in advance
DataStage Sterling
Oracle Connector error: ORA-24381
Then it sounds like you need to base your changes on this - maybe a separate job with smaller arrays that is run at busy times (unless, of course, the array size can be a parameter - this will depend on which stage type you are using).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Thanks Ray. I changed the Array Size (to 50, then 100000), but the job aborted with the same error. When I ran it from our 7.5.1 environment (which has the Oracle Enterprise stage), the job also aborted, but it loaded all except 4 records into the Oracle table. In those 4 records, one column contained special characters. It looks like the job is designed to load as many records as possible and then abort (very bad design, IMO). In our new environment (8.7 with the Oracle Connector), the job aborted and no data was loaded.
Special characters text
"M-bM-^@M-^S unable to get M-bM-^@M-"
I was able to replicate the same error in our development environment with the Oracle Connector stage. I added a reject link to the Oracle Connector stage and, to my surprise, the job now runs successfully, inserting the records with special characters into the Oracle database.
When I used an Oracle Enterprise stage, the data loaded without any errors.
Both the jobs in 7.5.1 and 8.7 environments point to the same Oracle database.
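The behavior described here can be sketched in plain Python (not DataStage or OCI code; the function and names are invented for illustration): with array DML, a bad row makes the whole statement error out, possibly after earlier rows have already landed, which matches the 7.5.1 "all except 4 records" result; adding a reject path diverts the bad rows so the load completes:

```python
def load_batch(rows, is_valid, reject_link=False):
    """Simulate array DML: insert rows in order; handle the first bad row.

    Without a reject link, a bad row raises an error resembling ORA-24381
    (rows before it may already be in the table). With one, bad rows are
    diverted and the load completes.
    """
    loaded, rejected = [], []
    for row in rows:
        if is_valid(row):
            loaded.append(row)
        elif reject_link:
            rejected.append(row)
        else:
            raise RuntimeError("ORA-24381: error(s) in array DML")
    return loaded, rejected

rows = ["ok1", "ok2", "\u2013 unable to get \u2013", "ok3"]
valid = lambda r: r.isascii()    # treat non-ASCII rows as the "bad" ones

print(load_batch(rows, valid, reject_link=True))
# (['ok1', 'ok2', 'ok3'], ['– unable to get –'])
```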