Job Creating Table Lock

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
cosec
Premium Member
Posts: 230
Joined: Tue May 08, 2007 8:10 pm

Job Creating Table Lock

Post by cosec »

I have two output links going to two separate Db2 stages but referring to the same table.

I have 21 records going through the transformer.

The settings for the DB2 stages are as follows:

(1) For the target DB2 stage 1 (which deletes):
Array Size = 1
Transaction Size = 1

(2) For the target DB2 stage 2 (which inserts):
Array Size = 10
Transaction Size = 20

When I run the job, the table gets locked. I assume it's to do with the commit. To what values should I set the Array Size and Transaction Size so the job runs successfully without causing table locks?
sachin1
Participant
Posts: 325
Joined: Wed May 30, 2007 7:42 am
Location: india

Re: Job Creating Table Lock

Post by sachin1 »

I have a similar case, but for insert and update. I use Array Size = 1500 and Transaction Size = 50, which works fine for hundreds of thousands (lakhs) of records. You can try that.
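As a rough illustration of how these two knobs interact: Array Size is roughly how many rows are sent per statement execution, and Transaction Size is how many rows are processed before a commit releases the locks held by the unit of work. Here is a minimal sketch in Python, using sqlite3 as a stand-in for DB2 (the function name and table are made up for illustration; a real DB2 job would go through the stage, not hand-written code):

```python
import sqlite3

def batched_load(rows, array_size=10, txn_size=20):
    """Insert rows in batches of array_size and commit every
    txn_size rows. A smaller txn_size releases locks sooner,
    at the cost of more commit overhead."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
    pending = 0
    for i in range(0, len(rows), array_size):
        batch = rows[i:i + array_size]
        conn.executemany("INSERT INTO t VALUES (?, ?)", batch)
        pending += len(batch)
        if pending >= txn_size:
            conn.commit()  # end the unit of work, releasing its locks
            pending = 0
    conn.commit()          # flush any remainder
    return conn

# 21 rows, as in the original question, with the values suggested above.
conn = batched_load([(i, "r%d" % i) for i in range(21)],
                    array_size=1500, txn_size=50)
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 21
```

The point of the large Array Size / modest Transaction Size combination is that rows move to the database efficiently in bulk, while commits still happen often enough that no single transaction holds locks long enough to escalate to a table lock.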
kris
Participant
Posts: 160
Joined: Tue Dec 09, 2003 2:45 pm
Location: virginia, usa

Re: Job Creating Table Lock

Post by kris »

Did you try Transaction Size = 1 for both the delete and insert stages?
It will slow the job down, but if your volumes are small it should be okay.

Otherwise you need to think about some other approach.
~Kris
cosec
Premium Member
Posts: 230
Joined: Tue May 08, 2007 8:10 pm

Re: Job Creating Table Lock

Post by cosec »

Yes, Kris, I tried it, but then it inserts only one record.
kris wrote:Did you try 'Transaction size 1' for both Delete & Insert?
It will slow down the job but if your volumes are small, should be okay.

Otherwise you need to think about some other approach.
abhi989
Participant
Posts: 28
Joined: Mon Sep 19, 2005 2:31 pm

Post by abhi989 »

One approach could be to send one set of records from the transformer to the DB2 table, and dump the other set into a sequential file. Then take the file and process it against the DB2 table in a second job.
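A minimal sketch of that two-job split in Python, again using sqlite3 as a stand-in for DB2 (the split rule, file name, and table are invented for illustration): job 1 applies the deletes and stages the inserts to a flat file; job 2 runs only after job 1 has committed, so the two writers never contend for locks on the table at the same time.

```python
import csv
import os
import sqlite3
import tempfile

# Stand-in "DB2" table with ids 0..4 (sqlite3 used for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(5)])
conn.commit()

# Hypothetical split: ids already present are deletes,
# new ids are staged for the second job.
records = [3, 4, 5, 6, 7]
deletes = [i for i in records if i < 5]
inserts = [i for i in records if i >= 5]

# Job 1: apply deletes directly, dump inserts to a sequential file.
path = os.path.join(tempfile.mkdtemp(), "inserts.txt")
conn.executemany("DELETE FROM t WHERE id = ?", [(i,) for i in deletes])
conn.commit()  # job 1 releases its locks here
with open(path, "w", newline="") as f:
    csv.writer(f).writerows([(i,) for i in inserts])

# Job 2: read the file back and insert, after job 1 has finished.
with open(path, newline="") as f:
    staged = [(int(row[0]),) for row in csv.reader(f)]
conn.executemany("INSERT INTO t VALUES (?)", staged)
conn.commit()

remaining = sorted(r[0] for r in conn.execute("SELECT id FROM t"))
print(remaining)  # [0, 1, 2, 5, 6, 7]
```

Serializing the delete and insert passes this way avoids the situation in the original job, where two stages hold open transactions against the same table at once.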
Post Reply