SQLLDR ignoring APT_ORACLE_LOAD_OPTIONS

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

SQLLDR ignoring APT_ORACLE_LOAD_OPTIONS

Post by pavankvk »

Hi,

I am using the Oracle load option to insert records into the table. My source can contain duplicates, and the target table is too huge to check the input against it for existing records, so I am using the ERRORS parameter of sqlldr to ignore the errors and continue loading the good records.

The options I am passing are $APT_ORACLE_LOAD_OPTIONS=OPTIONS(PARALLEL=TRUE,DIRECT=FALSE,ERRORS=9999)
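For reference, the Oracle Enterprise stage passes this value straight into the OPTIONS clause at the top of the SQL*Loader control file it generates, and ERRORS=9999 tells sqlldr to tolerate up to 9999 rejected rows (writing them to the .bad file) before stopping. A sketch of what the generated control file would start with; the infile, table, and column names here are hypothetical:

```text
OPTIONS(PARALLEL=TRUE, DIRECT=FALSE, ERRORS=9999)
LOAD DATA
INFILE 'input.dat'          -- hypothetical data file name
APPEND INTO TABLE target_table   -- hypothetical table name
FIELDS TERMINATED BY ','
(col1, col2)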

I am testing this job with only 1 record in the source, which is already in the target table. Since ERRORS is set to 9999, I expect the job to finish without aborting. But the job fails on a

ORA-00001: unique constraint violated

This is what I see in the log file:
Number to load: ALL
Number to skip: 0
Errors allowed: 9999
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional


I don't understand why Oracle fails even though ERRORS is set to a high value.

Thoughts please...
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

This is what is reported in the DataStage log:

The call to sqlldr failed; the return code = 2;
please see the loader logfile: /export/home/dsadm/Ascential/DataStage/Projects/scratch/ora.6483.383524.6.log for details.

Since the return code is 2, isn't that just a warning from sqlldr? So why does DataStage still abort?
Kryt0n
Participant
Posts: 584
Joined: Wed Jun 22, 2005 7:28 pm

Post by Kryt0n »

It's clearly not ignoring your load options as your errors allowed is being set to what you specify. This setting is more likely to tell the loader how many records it should allow through before falling over. Have you got a reject link out of the Oracle stage?
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

Kryt0n wrote:It's clearly not ignoring your load options as your errors allowed is being set to what you specify. This setting is more likely to tell the loader how many records it should allow through before falling over. Have you got a reject link out of the Oracle stage?
As I mentioned in my first post, I am running the job with only 1 input record in the source, and that record is also present in the target table. Since ERRORS is set to 9999, the job should not abort.

For the actual job I had specified the value as 99999999 and got the same unique constraint error, which is why I am testing with 1 record in the source.
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

Kryt0n wrote:It's clearly not ignoring your load options as your errors allowed is being set to what you specify. This setting is more likely to tell the loader how many records it should allow through before falling over. Have you got a reject link out of the Oracle stage?
No, I don't have a reject link. Should I have one when using Write Method = "Load"?
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

I was right before: I cannot use a reject link with the Load option on the Oracle stage.
Kryt0n
Participant
Posts: 584
Joined: Wed Jun 22, 2005 7:28 pm

Post by Kryt0n »

pavankvk wrote:
Kryt0n wrote:It's clearly not ignoring your load options as your errors allowed is being set to what you specify. This setting is more likely to tell the loader how many records it should allow through before falling over. Have you got a reject link out of the Oracle stage?
As I mentioned in my first post, I am running the job with only 1 input record in the source, and that record is also present in the target table. Since ERRORS is set to 9999, the job should not abort.

For the actual job I had specified the value as 99999999 and got the same unique constraint error, which is why I am testing with 1 record in the source.
True about the reject link... forgot it isn't allowed for the load option.

However, as already stated, this is more likely an instruction on how many errors sqlldr should accept; it does not mean sqlldr will consider the load successful. Try adding a few valid records to your input after the invalid row; this should show that sqlldr continues but the load is still reported as a failure. Also try setting errors allowed to 1 with 1 or 2 invalid records and see what it does.
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

Kryt0n wrote:
pavankvk wrote:
Kryt0n wrote:It's clearly not ignoring your load options as your errors allowed is being set to what you specify. This setting is more likely to tell the loader how many records it should allow through before falling over. Have you got a reject link out of the Oracle stage?
As I mentioned in my first post, I am running the job with only 1 input record in the source, and that record is also present in the target table. Since ERRORS is set to 9999, the job should not abort.

For the actual job I had specified the value as 99999999 and got the same unique constraint error, which is why I am testing with 1 record in the source.
True about the reject link... forgot it isn't allowed for the load option.

However, as already stated, this is more likely an instruction on how many errors sqlldr should accept; it does not mean sqlldr will consider the load successful. Try adding a few valid records to your input after the invalid row; this should show that sqlldr continues but the load is still reported as a failure. Also try setting errors allowed to 1 with 1 or 2 invalid records and see what it does.

What I did now is delete that source record from the target table and rerun the job. It inserts properly, so it is a valid record that loads fine on its own. On the second run, with ERRORS set to 9999, sqlldr should simply reject the duplicate and not fail with a unique constraint error.
mekrreddy
Participant
Posts: 88
Joined: Wed Oct 08, 2008 11:12 am

Post by mekrreddy »

Can you set ERRORS = 1 and test with one record in the source?
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

mekrreddy wrote:Can you set ERRORS = 1 and test with one record in the source?
It aborts with the same error. I see the allowed errors in the ctl file as well as in the log, but sqlldr still seems to ignore that parameter.
pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

The call to sqlldr failed; the return code = 2

If the return code is 2 (a warning), why does the job abort? Should it not just throw a warning? Am I missing anything here?
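That return code is the key. SQL*Loader's documented exit codes are 0 (EX_SUCC, all rows loaded), 1 (EX_FAIL, command-line or syntax error), 2 (EX_WARN, some rows rejected or discarded), and 3 (EX_FTL, fatal OS-level error). As this thread shows, the Oracle Enterprise stage apparently treats any nonzero return code as a failed load, which is why the EX_WARN caused by the rejected duplicate still aborts the job even though ERRORS was honored. A minimal shell sketch of that interpretation (the function name is made up for illustration):

```shell
#!/bin/sh
# Interpret SQL*Loader exit codes the way a calling tool might.
# 0=EX_SUCC (all rows loaded), 1=EX_FAIL (command-line/syntax error),
# 2=EX_WARN (some rows rejected/discarded), 3=EX_FTL (fatal OS error).
describe_sqlldr_rc() {
  case "$1" in
    0) echo "success: all rows loaded" ;;
    1) echo "failure: command-line or syntax error" ;;
    2) echo "warning: some rows rejected or discarded" ;;
    *) echo "fatal error" ;;
  esac
}

# DataStage-style check: anything other than 0 aborts the job,
# so a "warning" exit of 2 is still treated as a failed load.
rc=2
describe_sqlldr_rc "$rc"
if [ "$rc" -ne 0 ]; then
  echo "load treated as failed (rc=$rc)"
fi
```

So sqlldr did continue past the error and finish, exactly as ERRORS promises; it is the caller's all-or-nothing reading of the exit code that turns the warning into an abort.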
HariK
Participant
Posts: 68
Joined: Thu May 17, 2007 1:50 am

Post by HariK »

Try adding the two options below to your environment variable and check whether the job completes successfully:

SKIP_UNUSABLE_INDEXES=TRUE,SKIP_INDEX_MAINTENANCE=TRUE
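If you try that, these keywords would go into the same OPTIONS clause of the environment variable, something like the sketch below. One caveat: SKIP_INDEX_MAINTENANCE applies only to direct path loads, so with DIRECT=FALSE it may have no effect.

```text
$APT_ORACLE_LOAD_OPTIONS=OPTIONS(PARALLEL=TRUE,DIRECT=FALSE,ERRORS=9999,SKIP_UNUSABLE_INDEXES=TRUE,SKIP_INDEX_MAINTENANCE=TRUE)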