
Error during loading rows in DB2 database

Posted: Fri Sep 26, 2008 2:12 pm
by dsscholar
Hello,

I am trying to load a file with 6000 records into DB2. The job loads 48 rows out of 6000 and throws the following warning for the remaining rows:

[IBM][CLI Driver][DB2] SQL0803N One or more values in the INSERT statement, UPDATE statement, or foreign key update caused by a DELETE statement are not valid because the primary key, unique constraint or unique index identified by "" constrains table "" from having duplicate rows for those columns. SQLSTATE=23505

If I re-run the same file, another 48 rows are loaded, making the total count 96, so I am sure duplicate records are not causing the issue, because the same records are loaded again.

I checked that there are no foreign keys defined for this table, so referential integrity can't be the reason.

Please advise.

Also, I want to roll back if even one of the rows is rejected. I have set the transaction size to 0, but I still see 48 rows loaded in the DB and warnings for the rest. Please let me know if I need to do anything extra.

Thanks in advance !
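
Before assuming the table is at fault, it can help to confirm whether the input file itself is clean on the unique-key columns. Below is a minimal sketch of such a check; the key column positions and the delimiter are hypothetical, since the post does not show the actual file layout:

```python
from collections import Counter

# Hypothetical example: the real file layout and key columns are not shown
# in the post, so adjust KEY_COLUMNS and DELIMITER to match your data.
KEY_COLUMNS = (0, 1)  # positions of the unique-key fields in each record
DELIMITER = "|"

def find_duplicate_keys(lines):
    """Return the key tuples that appear more than once in the input lines."""
    counts = Counter()
    for line in lines:
        fields = line.rstrip("\n").split(DELIMITER)
        key = tuple(fields[i] for i in KEY_COLUMNS)
        counts[key] += 1
    return {key: n for key, n in counts.items() if n > 1}

# Quick demonstration on an in-memory sample instead of the real 6000-record file:
sample = ["1|A|x", "2|B|y", "1|A|z"]
print(find_duplicate_keys(sample))  # {('1', 'A'): 2}
```

If this reports any duplicates on the columns that back the table's unique index, that would explain the SQL0803N rejections.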

Posted: Fri Sep 26, 2008 2:34 pm
by chulett
No clue about your DB2 specifics, but understand that a job must abort for any rollback of uncommitted records to happen.

Posted: Tue Oct 07, 2008 8:31 pm
by abc123
This might happen if a DELETE or UPDATE was done on the table and commit wasn't done.

Posted: Thu Dec 04, 2008 4:27 pm
by dscon9128
Check your unique key constraints in your destination table.

Re: Error during loading rows in DB2 database

Posted: Thu Dec 04, 2008 11:07 pm
by infranik
Try a workaround job: insert a Sort stage and a Remove Duplicates stage before the final output (the keys for the sort and for the remove-duplicates should be the same). In this workaround, remove the DB2 stage and write all the records to a dataset instead. Then you can compare the row counts in the job monitor against the original job, and try to load the table from the dataset created by this workaround job.
I believe this warning occurs because you definitely have some duplicates in the source.
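
The Sort plus Remove Duplicates combination described above amounts to keeping one record per distinct key value. A rough sketch of that logic, with a hypothetical record layout and key function:

```python
def sort_and_remove_duplicates(records, key):
    """Mimic a Sort stage followed by a Remove Duplicates stage:
    sort on the key, then keep only the first record per distinct key."""
    seen = set()
    result = []
    for rec in sorted(records, key=key):
        k = key(rec)
        if k not in seen:     # first occurrence of this key survives
            seen.add(k)
            result.append(rec)
    return result

# Hypothetical records where the first two fields form the unique key:
recs = [("1", "A", "x"), ("2", "B", "y"), ("1", "A", "z")]
print(sort_and_remove_duplicates(recs, key=lambda r: r[:2]))
```

Comparing the count before and after this step tells you how many source rows collide on the key, which should match the number of rows DB2 rejected.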