Duplicate record error

Roopanwita
Participant
Posts: 125
Joined: Mon Sep 11, 2006 4:22 am
Location: India

Duplicate record error

Post by Roopanwita »

Hi,
I have a job which loads data from a sequential file into a target table (no intermediate stage in between, and no processing). The job aborted because of bad data (a character value arrived in an Integer field). In the target stage I have set Transaction Size to 1, and suppose the bad data is the 50th record. My question is: will the first 49 records be written to the table? The target uses the insert-only option.
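As I understand it, Transaction Size = 1 means the stage commits after every row, so the effect is roughly the following (only a sketch; the table name and values are made up):

Code: Select all

INSERT INTO TARGET_TABLE VALUES (1, 'row 1 data');
COMMIT;                                  -- committed immediately
-- ... rows 2 to 49 each insert and commit the same way ...
INSERT INTO TARGET_TABLE VALUES (50, 'bad row');
-- character in an Integer column: the insert fails and the job aborts,
-- but rows 1-49 are already committed and stay in the table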

If those records were written, the next run will give a duplicate record error (after I fix the data and run the job again). Without any manual process (like deleting records from the target table or dropping the index), how can I load the remaining records?
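If changing the job's SQL were allowed (it is not, in my case), a user-defined insert along these lines would skip the rows that are already present. This is only a hypothetical sketch; TARGET_TABLE, KEY_COL, DATA_COL and the Oracle-style DUAL are my assumptions:

Code: Select all

-- Insert the bound row only if its key is not already in the table
INSERT INTO TARGET_TABLE (KEY_COL, DATA_COL)
SELECT :KEY_COL, :DATA_COL
  FROM DUAL
 WHERE NOT EXISTS (SELECT 1
                     FROM TARGET_TABLE T
                    WHERE T.KEY_COL = :KEY_COL);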

I am not supposed to change the job design.

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

This is another example of a silly requirement. You are supposed to change the way a program works without changing the program? Do they issue you with a magic wand?

You are right that, depending on the load method being used, the already-loaded rows ought to be rejected by the database, so only the source rows not already loaded ought to be loaded in the later run. This can be thwarted by a number of factors, however, such as your warnings limit.
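For illustration only, assuming an Oracle target with a unique index on the key (all names invented), the rerun behaves roughly like this:

Code: Select all

-- rows 1-49 are already in the table, so each re-insert is rejected
INSERT INTO TARGET_TABLE VALUES (1, 'row 1 data');
-- ORA-00001: unique constraint violated -> logged as a warning
-- ... 48 more rejections, one per already-loaded row ...
INSERT INTO TARGET_TABLE VALUES (50, 'fixed row');   -- succeeds
-- if the job's warnings limit is below 49, the job aborts on the
-- rejections before it ever reaches the corrected row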
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.