Update then insert not working in 8.7
Moderators: chulett, rschirm, roy
Hmmm... hard to say based on what was posted. For a "combo" action to work, the first action must fail before the second one will fire. So the update must update zero records before the insert is attempted. It sounds like that happens and then you get a PK violation on the insert, yes?
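To make that failure mode concrete, here is a minimal sketch of the update-then-insert logic, with sqlite3 standing in for the real target database. The table, column names, and sample data are all invented for illustration; the point is that the insert only fires when the update touches zero rows, and with array-style batching a duplicate key in the source can slip past both updates and collide on the second insert.

```python
# Sketch of "update then insert" (sqlite3 as a stand-in; names invented).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (pk INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'existing')")

def update_then_insert(pk, val):
    cur = conn.execute("UPDATE target SET val = ? WHERE pk = ?", (val, pk))
    if cur.rowcount == 0:               # update matched nothing...
        conn.execute("INSERT INTO target VALUES (?, ?)", (pk, val))

update_then_insert(1, "updated")        # row exists: update wins, no insert
update_then_insert(2, "new")            # row missing: insert fires

# Batched variant: run all the updates first (as a large array size would),
# then insert every row whose update matched nothing. A key duplicated
# within the batch passes both updates, so the second insert fails.
pending = []
for pk, val in [(3, "a"), (3, "b")]:
    cur = conn.execute("UPDATE target SET val = ? WHERE pk = ?", (val, pk))
    if cur.rowcount == 0:
        pending.append((pk, val))

try:
    conn.executemany("INSERT INTO target VALUES (?, ?)", pending)
except sqlite3.IntegrityError as exc:
    print("PK violation:", exc)
```

This mirrors the symptom in the thread: the combo action itself works, yet duplicate keys arriving in the same batch still surface as a primary-key violation on the insert leg.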
Your PK column, is it a surrogate key? If so, how are you handling it? And what is your "key" field for the update - one or more business key values?
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 117
- Joined: Wed Feb 06, 2013 9:24 am
- Location: Chennai,TN, India
Hey, you're the one that said you had a "single primary key (numeric) column in table".
So these duplicates... do you have duplicates in the file itself, or is there just a single instance of the key value in the file, which may or may not exist in the table? The answer changes how you need to handle this.
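The distinction matters because the two cases need different fixes. A minimal sketch (plain Python, with invented sample data and field names) of how to tell them apart before loading:

```python
# Separate keys duplicated inside the source file itself from keys that
# are unique in the file but already present in the target table.
from collections import Counter

file_rows = [(10, "a"), (11, "b"), (10, "c")]   # source records (pk, value)
table_keys = {11, 12}                           # keys already in the table

counts = Counter(pk for pk, _ in file_rows)
dups_in_file = {pk for pk, n in counts.items() if n > 1}
already_in_table = {pk for pk, _ in file_rows} & table_keys

print(dups_in_file)       # duplicated within the file: dedupe before the load
print(already_in_table)   # unique in the file: the update leg handles these
```

Keys that already exist in the table are exactly what the update leg is for; keys repeated within the file are the ones that break the insert leg and need a dedupe (e.g. a Remove Duplicates stage) upstream.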
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Premium Member
- Posts: 353
- Joined: Mon Jan 17, 2011 5:03 am
- Location: Mumbai, India
@akarsh
We have also experienced the same issue while updating data on our DB2 tables. Try one of these solutions:
1) Use Oracle Connector Partitioning instead of Hash Partitioning in the target.
OR
2) Reduce the "Array Size" to 1, keeping the Record Count at a higher value, e.g. 2000.
OR
3) Run your job on a single node.
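On suggestion 2, a rough illustration (again sqlite3 as a stand-in, names invented) of why an array size of 1 avoids the collision: each row's insert completes before the next row's update runs, so the second copy of a duplicate key is caught by the update leg instead of colliding on the insert.

```python
# With per-row processing, a duplicate key never reaches the insert twice.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (pk INTEGER PRIMARY KEY, val TEXT)")

for pk, val in [(3, "a"), (3, "b")]:    # duplicate key in the source
    cur = conn.execute("UPDATE target SET val = ? WHERE pk = ?", (val, pk))
    if cur.rowcount == 0:
        conn.execute("INSERT INTO target VALUES (?, ?)", (pk, val))
    conn.commit()                       # array size 1: one row per round trip

print(conn.execute("SELECT val FROM target WHERE pk = 3").fetchone())
```

The same reasoning applies to suggestions 1 and 3: keeping all rows with the same key in one stream (connector partitioning or a single node) guarantees they are applied in order rather than racing each other across partitions.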
Thanx and Regards,
ETL User