Jobs simultaneously accessing the same DB2 table

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India

Jobs simultaneously accessing the same DB2 table

Post by HSBCdev »

I'm after some advice about using db2 stages.

I've got a sequence which eventually calls 6 jobs, all of which try to load (update or insert) data into the same DB2 table. I'm getting a lot of deadlocks and timeouts.

I've tried setting the transaction size to 1 which seems to help on one table that I use, but doesn't help when I do a similar thing on another table - I don't know why there should be a difference.

My question is... is it a bad idea to have these 6 jobs running at the same time? Should I change it so that one load file is created and only one job is run, or should I make sure that the 6 separate jobs run sequentially?

Or.. is it just something I've not set up correctly which is causing me to get these deadlocks?

Thanks
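Not DataStage-specific, but one common mitigation when occasional deadlocks are tolerable is to retry the failed transaction with exponential backoff rather than serialising all the jobs. A minimal sketch of the pattern (the DeadlockError class and flaky_update function are simulated here for illustration, not a real DB2 driver API):

```python
import time

class DeadlockError(Exception):
    """Stand-in for a driver-specific deadlock/timeout error (DB2 reports SQLCODE -911)."""

def run_with_retry(txn, max_attempts=5, base_delay=0.01):
    """Run a transaction callable, retrying with exponential backoff on deadlock."""
    for attempt in range(1, max_attempts + 1):
        try:
            return txn()
        except DeadlockError:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** (attempt - 1))  # back off before retrying

# Simulated transaction that deadlocks twice, then succeeds.
attempts = {"n": 0}
def flaky_update():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise DeadlockError("simulated -911")
    return "committed"
```

Whether this is appropriate depends on whether the jobs can safely re-run a failed unit of work.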
neena
Participant
Posts: 90
Joined: Mon Mar 31, 2003 4:32 pm

Post by neena »

Hi,
This problem is not with your settings; it's just how DB2 behaves. You can't update a single table from 6 jobs at the same time.
You have to run the jobs sequentially, one after the other.
If all 6 jobs produce the same data structure, write the output to 6 sequential files, concatenate them into one, and load that single file into DB2 (if that's possible).
Any database will take locks on the table while one process is working on it, and when you run all the jobs at once the other processes will be waiting for those locks to be released.
Talk to your DB2 DBA about how DB2 handles locking.
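The concatenation step could be as simple as appending the per-job files in order before the single load job runs; a sketch (file names here are made up for illustration):

```python
import os
import tempfile
from pathlib import Path

def concat_load_files(parts, combined):
    """Append each part file to the combined load file, in order."""
    with open(combined, "wb") as out:
        for part in parts:
            out.write(Path(part).read_bytes())

# Example with temporary files standing in for the 6 job outputs:
tmp = tempfile.mkdtemp()
parts = []
for i in range(6):
    p = os.path.join(tmp, f"job{i}.dat")
    with open(p, "w") as f:
        f.write(f"row-from-job-{i}\n")
    parts.append(p)

combined = os.path.join(tmp, "combined.dat")
concat_load_files(parts, combined)
```

In practice this would more likely be a `cat` in an after-job subroutine, but the idea is the same: one writer, so no lock contention.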

Thanks
Neena
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India

Post by HSBCdev »

Thanks Neena,


I'll concatenate the load files so one job can process them, as you suggest.
Neil C
Participant
Posts: 46
Joined: Sun Apr 06, 2003 8:59 pm
Location: Auckland, New Zealand

Post by Neil C »

You could also ask your DB2 DBA whether row-level locking might help you. Generally a DB2 table is created with a number of rows per block, and an update lock locks the whole block. If the blocks are large, or the rows are short, the probability of two updates hitting the same block is high. Row-level locking might help with this. Alternatively, define the table with only one row per block. That would be OK if there are not too many rows involved; otherwise it might waste a lot of space in the DB2 table space.
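A back-of-the-envelope sketch of why short rows make block-level contention likely (assuming a 4 KB page and ignoring page overhead, which real DB2 space accounting does not):

```python
def rows_per_page(row_bytes, page_bytes=4096):
    """How many rows pack into one page/block (page overhead ignored)."""
    return page_bytes // row_bytes

def collision_probability(rows_touched, total_pages):
    """Probability that at least two of `rows_touched` independent random
    updates land on the same page (a birthday-problem estimate)."""
    p_no_collision = 1.0
    for k in range(rows_touched):
        p_no_collision *= (total_pages - k) / total_pages
    return 1.0 - p_no_collision

# With 50-byte rows, about 81 rows share each 4 KB page, so a table of
# 100,000 rows spans only about 1,234 pages; under block-level locking,
# 100 concurrent updates are then almost certain to contend for a page.
short_pages = 100_000 // rows_per_page(50)
```

With row-level locking the contention unit shrinks from the page to the individual row, so this estimate no longer applies.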
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India

Post by HSBCdev »

Thanks Neil,

I think that having 1 row per block would waste too much space. For now I'm just running my sub-jobs sequentially when they access the same table. It's not too much of a performance problem in my current job.

Thanks all for your suggestions and explanations.