The call to sqlldr failed; the return code = 2;

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

austin_316
Participant
Posts: 80
Joined: Fri Aug 21, 2009 7:49 am
Location: India

The call to sqlldr failed; the return code = 2;

Post by austin_316 »

hi,
we have an ETL job which takes data from a dataset and loads it into a table through an Oracle Enterprise stage.

the options in the Oracle Enterprise stage are
Write Method=Load
Write Mode=Append
The table has a unique constraint defined on two columns, col1 and col2.

the issue we are facing now is that a record with the values C1 and C2 in the two unique-constraint columns is already present in the table, and a record with the same values C1 and C2 is coming again from the source.
When we use the Load/Append method the job aborts with the error
TGT_INS,3: The call to sqlldr failed; the return code = 2;
please see the loader logfile: /IBM_DS/Datastage/InformationServer/Server/Scratch/ora.9109944.740917.3.log for details.
i have checked the log file mentioned, and its contents are
SQL*Loader: Release 11.1.0.7.0 - Production on Mon Aug 1 17:13:01 2011

Copyright (c) 1982, 2007, Oracle. All rights reserved.

Control File: ora.7274830.660236.3.ctl
Data File: ora.7274830.660236.fifo.3
File processing option string: "FIX 135"
Bad File: ora.7274830.660236.3.log.bad
Discard File: none specified

(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional

Table schema1_tgt.ACCT_PKG_DTL, loaded from every logical record.
Insert option in effect for this table: APPEND

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
"col1" 1:20 20 PACKED DECIMAL (38, 0)
NULL if 20:20 = BLANKS
"col2" 21:39 19 DATE YYYY-MM-DD HH24:MI:SS
"col3" 40:103 64 CHARACTER
"col4" 104:135 32 CHARACTER

Record 1: Rejected - Error on table schema1_tgt.ACCT_PKG_DTL.
ORA-00001: unique constraint (schema1_tgt.JYU) violated


Table schema1_tgt.ACCT_PKG_DTL:
2 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 9216 bytes(64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 3
Total logical records rejected: 1
Total logical records discarded: 0

Run began on Mon Aug 01 17:13:01 2011
Run ended on Mon Aug 01 17:13:01 2011

Elapsed time was: 00:00:00.53
CPU time was: 00:00:00.02
and the data in the ctl file /IBM_DS/Datastage/InformationServer/Server/Scratch/ora.9109944.740917.3.ctl is
OPTIONS(DIRECT=FALSE,PARALLEL=TRUE)
LOAD DATA INFILE 'ora.7274830.660236.fifo.3' "FIX 135"
APPEND INTO TABLE schema1_tgt.ACCT_PKG_DTL
(
"col1" POSITION (1:20) decimal(38,0) NULLIF (20) = BLANKS ,
"col2" POSITION (21:39) date "YYYY-MM-DD HH24:MI:SS",
"col3" POSITION (40:103) char,
"col4" POSITION (104:135) char
)
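As an aside, the "FIX 135" option and the POSITION(start:end) ranges in the control file describe a fixed-width record layout, where the ranges are 1-based and inclusive. A minimal illustrative sketch (not DataStage code; field names and widths are taken from the control file above):

```python
# Illustrative only: how SQL*Loader's 1-based, inclusive POSITION(start:end)
# ranges map onto a fixed-width "FIX 135" record.
def field(record, start, end):
    """Extract POSITION(start:end) from a fixed-width record."""
    return record[start - 1:end]  # 1-based inclusive -> Python slice

# Layout from the control file above: widths sum to the FIX 135 record length.
layout = {
    "col1": (1, 20),     # decimal(38,0) area
    "col2": (21, 39),    # date "YYYY-MM-DD HH24:MI:SS"
    "col3": (40, 103),   # char(64)
    "col4": (104, 135),  # char(32)
}
```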
we do not face this issue when the records coming from the source are unique (the unique-constraint columns differ).

we have not specified anything for the index mode, since we cannot rebuild the indexes (duplicates might exist, and index creation would fail in that case), and the maintenance option can be set only for partitioned tables.
please help in solving this issue.

thanks
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You can't use sqlldr to insert duplicate records; as you've found, they must all be "new" for the bulk loader to work properly.
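To illustrate the point with a hypothetical sketch (not the actual job logic): before handing rows to the bulk loader you would keep only those whose key is not already in the target, since the APPEND path simply inserts and trips the unique constraint otherwise.

```python
# Hypothetical pre-load filter: drop incoming rows whose (col1, col2) key
# already exists in the target. "existing_keys" would come from a lookup
# against ACCT_PKG_DTL; the values here are made up for illustration.
existing_keys = {("C1", "C2")}

incoming = [
    ("C1", "C2", "dup row"),  # same key as an existing row -> skipped
    ("C3", "C4", "new row"),  # genuinely new -> passed to the loader
]

to_load = [row for row in incoming if (row[0], row[1]) not in existing_keys]
```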
-craig

"You can never have too many knives" -- Logan Nine Fingers
austin_316
Participant
Posts: 80
Joined: Fri Aug 21, 2009 7:49 am
Location: India

The call to sqlldr failed; the return code = 2;

Post by austin_316 »

thanks craig
i guess we have to go with UPSERT mode only :roll:
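For context, the Upsert write method updates a row when its key already exists and inserts it otherwise. A rough sketch of that per-key semantics (illustrative only; the stage actually issues INSERT/UPDATE statements against Oracle):

```python
# Illustrative upsert semantics: update when the key exists, insert when
# it does not. The dict stands in for the target table.
def upsert(table, key, row):
    action = "update" if key in table else "insert"
    table[key] = row
    return action

target = {("C1", "C2"): {"col3": "old"}}
```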
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That or make sure you filter out any "duplicates" first.
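Filtering duplicates out of the feed itself could look like this (a minimal sketch; in a real parallel job a Remove Duplicates stage or a sorted dedup would do the same job):

```python
# Keep the first row seen for each (col1, col2) key; later rows with the
# same key are dropped before the load. key_cols gives the key positions.
def dedupe(rows, key_cols=(0, 1)):
    seen, out = set(), []
    for row in rows:
        key = tuple(row[i] for i in key_cols)
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out
```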
-craig

"You can never have too many knives" -- Logan Nine Fingers
austin_316
Participant
Posts: 80
Joined: Fri Aug 21, 2009 7:49 am
Location: India

The call to sqlldr failed; the return code = 2;

Post by austin_316 »

hi,
sorry for reopening the topic.

when i am executing the job to load data into the table, the job is aborting, even though this time there are no constraints on the table at all.
Oracle_Enterprise_3,2: The call to sqlldr failed; the return code = 2;
please see the loader logfile: /data/IBM/Scratch_1/ora.5114.498752.2.log for details.
in the log file i see
SQL*Loader: Release 10.2.0.1.0 - Production on Tue Aug 2 15:04:40 2011

Copyright (c) 1982, 2005, Oracle. All rights reserved.

Control File: ora.5114.498752.2.ctl
Data File: ora.5114.498752.fifo.2
File processing option string: "FIX 135"
Bad File: ora.5114.498752.2.log.bad
Discard File: none specified

(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional

Table schema1_tgt.ACCT_PKG_DTL, loaded from every logical record.
Insert option in effect for this table: APPEND

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
"col1" 1:20 20 PACKED DECIMAL (38, 0)
NULL if 20:20 = BLANKS
"col2" 21:39 19 DATE YYYY-MM-DD HH24:MI:SS
"col3" 40:103 64 CHARACTER
"col4" 104:135 32 CHARACTER

Record 1: Rejected - Error on table schema1_tgt.ACCT_PKG_DTL, column "col4".
ORA-12899: value too large for column "schema1_tgt"."ACCT_PKG_DTL"."col4" (actual: 5, maximum: 4)


Table schema1_tgt.ACCT_PKG_DTL:
1 Row successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 9216 bytes(64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 2
Total logical records rejected: 1
Total logical records discarded: 0

Run began on Tue Aug 02 15:04:40 2011
Run ended on Tue Aug 02 15:04:40 2011

Elapsed time was: 00:00:00.24
CPU time was: 00:00:00.02
as i can see, the reason for the abort is that the incoming data is larger than the column size. rather than the record simply being rejected, the whole job is aborting.

in the director client i can see
Oracle_Enterprise_3,0: Export complete; 2 records exported successfully, 0 rejected.
can someone tell me in which cases a record gets rejected without the job aborting?
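One interpretation of the log, based on documented SQL*Loader behaviour: sqlldr itself does reject the bad row to the .bad file and keeps loading (up to ERRORS=50), but any rejected or discarded row makes it exit with the warning code 2 (EX_WARN), and the Oracle Enterprise stage evidently treats any nonzero exit code as a failure, so the job aborts even though the load "succeeded with warnings". A sketch of that exit-code check:

```python
# Documented SQL*Loader exit codes on Unix (from the Oracle utilities docs):
#   0 = EX_SUCC (all rows loaded), 1 = EX_FAIL (command-line or syntax error),
#   2 = EX_WARN (some rows rejected/discarded), 3 = EX_FTL (fatal error).
SQLLDR_EXIT = {0: "EX_SUCC", 1: "EX_FAIL", 2: "EX_WARN", 3: "EX_FTL"}

def stage_aborts(return_code):
    """Assumption from the log above: the stage aborts on any nonzero code."""
    return return_code != 0
```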