Performance issue in parallel job while writing to DB2 stage.

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
RupeshVKolte
Participant
Posts: 1
Joined: Mon Sep 29, 2008 10:53 pm

Performance issue in parallel job while writing to DB2 stage.

Post by RupeshVKolte »

Hi,

I am having trouble analyzing the time taken by a job that has two lookups, one sort, one aggregator, and a few transformer stages.
The job takes 1 hr 30 min to complete (processing 85,000 rows). I have the following observations:
1> All the lookups and the source query take less than 10 min to fetch the data.
2> The job takes 1 hr 30 min even when processing 0 rows (no rows were fetched from the source or written to the target).
3> If I run the same job with the target table (DB2 table) replaced by a Data Set or Sequential File, the job runs in 13 min (for the same 85,000 rows) and 7 min for 0 rows.
4> For the insert/update of the target table, all the fields used in the WHERE clause are indexed.

I am not able to figure out why this job takes so much time to complete.

What may be the probable cause of such behavior?
What can be done to bring down the run time of this job?

Thanks in advance....
Rupesh Kolte
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Welcome aboard.

Are you selecting from the same DB2 table (either in the stream or the lookups) that you are trying to upsert into? You may be experiencing locking problems in the database. Have your friendly DBA monitor locks while the job is running.
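If you want to check this yourself before involving the DBA, DB2 ships standard tools for viewing lock activity. A minimal sketch (the database name `MYDB` is a placeholder; exact output and options vary by DB2 version):

```shell
# List current lock holders and waiters via the db2pd problem-determination tool
db2pd -db MYDB -locks

# Older snapshot-based view of lock activity
# (requires the LOCK monitor switch to be enabled for full detail)
db2 get snapshot for locks on MYDB
```

Run either while the job is executing; lock-wait entries naming the target table would support the contention theory.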
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rajan.n
Premium Member
Posts: 96
Joined: Mon Oct 09, 2006 7:47 am

Post by rajan.n »

Hi Ray,
Could you please share some information?

I have a similar scenario now: I have to do a lookup, an insert, and an update all on the same table, irrespective of volume (actually about 50K records). What is the best design approach we can follow?

1) Can we do the lookup, update, and insert all on one table in one job? If so, what pre-steps do we need to take care of?
2) Or we can look up against the table and write the inserts and updates separately to Data Sets, then have two different jobs, one doing the inserts to the table and the other the updates.


Can you please give me the pros and cons of both approaches too? Thanks.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

This is not the same question. Therefore please begin a new thread.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rajan.n
Premium Member
Posts: 96
Joined: Mon Oct 09, 2006 7:47 am

Post by rajan.n »

Done, thank you.
Post Reply