
DataStage Server Job - Performance Issue

Posted: Thu Oct 19, 2006 2:18 pm
by maneesahmed
Hello,

We are using PeopleSoft EPM delivered ETL jobs for loading data from a PeopleSoft OLTP table to a staging area table on Oracle 10g.

DataStage Version: 7.5
OS: HP-UX

The job follows CRC logic to load the data. Since this is the initial load, there is not much happening on the CRC front.
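[Editor's note] The CRC logic referred to here is the usual change-detection pattern: checksum each incoming row's non-key columns and compare it with the checksum stored for that key in the target, so unchanged rows can be skipped. A minimal Python sketch of the idea (the `emplid` key column is a made-up example, not from the actual job):

```python
import zlib

def row_crc(row, key_cols):
    """Checksum over the non-key columns, used to detect changed rows."""
    payload = "|".join(str(v) for k, v in sorted(row.items()) if k not in key_cols)
    return zlib.crc32(payload.encode("utf-8"))

def classify(row, existing_crcs, key_cols=("emplid",)):
    """Route a row: insert if the key is new, update if the CRC changed,
    skip if nothing changed.  On an initial load existing_crcs is empty,
    so every row is an insert and the CRC comparison never fires."""
    key = tuple(row[k] for k in key_cols)
    if key not in existing_crcs:
        return "insert"
    if existing_crcs[key] != row_crc(row, key_cols):
        return "update"
    return "skip"
```

This is why an initial load sees no CRC overhead: every key is new, so every row takes the "insert" branch.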

Job Structure:

DRS (Source Query) -> IPC -> Row Merge -> Transformer1 -> Transformer2 -> Row Split -> Transformer3 -> IPC2 -> DRS

This particular job is supposed to load 400,000 rows. The initial 3-4 loads ran at 300 rows/sec, which was acceptable.

Recently we have started facing an issue wherein the job starts at 300 rows/sec and the speed gradually drops. Eventually the job comes down to 15 rows/sec and takes close to 8 hours. We haven't modified anything in the job. Not sure if there is anything on the environment side that could impact the performance.

Note: As the number of rows processed increases, the speed keeps dropping.

Any suggestions/help in this regard will be appreciated.

Thanks,
Anees

Posted: Thu Oct 19, 2006 2:22 pm
by Krazykoolrohit
This happened to me on DB2, and the reason in my case was that I was getting lots of warnings. The job took hours just to write those warnings.

These warnings come from the load to the table: data issues, metadata mismatches. Try to find the log file.

Posted: Thu Oct 19, 2006 2:31 pm
by kris007
What is your Update action set to? Is it just inserts or Updates/Inserts?

Performance Issue

Posted: Thu Oct 19, 2006 2:32 pm
by maneesahmed
Krazykoolrohit wrote: This happened to me on DB2, and the reason in my case was that I was getting lots of warnings. The job took hours just to write those warnings.

These warnings come from the load to the table: data issues, metadata mismatches. Try to find the log file.
Thanks for the reply,

I checked the DataStage job log and everything looked fine. There were no warnings or errors.

Posted: Thu Oct 19, 2006 2:34 pm
by maneesahmed
kris007 wrote:What is your Update action set to? Is it just inserts or Updates/Inserts?
It is set to "Insert New Rows or Update Existing"

Posted: Thu Oct 19, 2006 2:38 pm
by kris007
That is where your problem is. It's always better to separate inserts and updates (via two separate links) to load data into a table rather than doing both through a single link. Combined insert/update actions usually (no, not usually; I would rather say always) take a long time (in fact, a very, very long time), and it gets even worse as the table grows larger. Split your updates and inserts in your job and it will start flying. :wink:
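[Editor's note] The split kris007 describes amounts to deciding up front, via a key lookup, which rows already exist, then sending each row down exactly one link so the database only ever runs plain INSERTs or plain UPDATEs. A rough sketch of the idea using Python and SQLite (table and column names are invented for illustration; a real DataStage job would use a hashed-file lookup and two DRS links):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg (emplid TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO stg VALUES ('1', 'old')")

incoming = [("1", "new"), ("2", "fresh")]

# One key lookup up front, then every row goes down exactly one link,
# so the database never has to probe for existence row by row.
existing = {k for (k,) in conn.execute("SELECT emplid FROM stg")}
inserts = [r for r in incoming if r[0] not in existing]
updates = [(name, k) for (k, name) in incoming if k in existing]

conn.executemany("INSERT INTO stg VALUES (?, ?)", inserts)
conn.executemany("UPDATE stg SET name = ? WHERE emplid = ?", updates)
conn.commit()
```

The win comes from removing the per-row "does this key exist?" probe, whose cost grows with the target table.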

Performance

Posted: Thu Oct 19, 2006 2:41 pm
by maneesahmed
kris007 wrote: That is where your problem is. It's always better to separate inserts and updates (via two separate links) to load data into a table rather than doing both through a single link. Combined insert/update actions usually (no, not usually; I would rather say always) take a long time (in fact, a very, very long time), and it gets even worse as the table grows larger. Split your updates and inserts in your job and it will start flying. :wink:
Sure, that is something we will try out.
What surprises me is that the same job ran fine a couple of times and then suddenly started behaving this way.

Since we are using PeopleSoft-delivered ETL jobs, any customization is owned by the vendor. This is an issue we face whenever we modify anything in the delivered jobs.

Posted: Thu Oct 19, 2006 2:44 pm
by kris007
It ran fine for the initial loads because the table was empty and all rows were inserts. But as your table grows, it performs the insert-else-update action for every row, which is why it takes longer.

performance

Posted: Thu Oct 19, 2006 2:46 pm
by maneesahmed
kris007 wrote: It ran fine for the initial loads because the table was empty and all rows were inserts. But as your table grows, it performs the insert-else-update action for every row, which is why it takes longer.
Even the current load is an initial load, as we have truncated the target table completely and are reloading the data.

Re: performance

Posted: Thu Oct 19, 2006 2:51 pm
by dsscholar
Check the free space and current activity on the Unix box. I faced this issue once; the problem was that with so many activities going on the Unix box, there was not much space left.

Re: performance

Posted: Thu Oct 19, 2006 2:51 pm
by gateleys
Upserts can take a lot of time. Another thing to look into is the various indexes and other constraints enforced on the target. Drop them and perform your inserts and updates (in separate jobs); you will definitely see an increase in performance. Try playing around with the Array Size and Rows per transaction settings as well.
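[Editor's note] Array Size controls how many rows are shipped to the database per round trip, and Rows per transaction (Transaction Size) controls how many rows are written per commit; tuning either reduces per-row overhead. A small Python/SQLite sketch of the batching pattern (the table, column names, and default values are illustrative only, not recommendations for this job):

```python
import sqlite3

def load(conn, rows, array_size=500, rows_per_txn=5000):
    """Insert rows in array_size batches, committing every rows_per_txn
    rows.  array_size stands in for the DRS stage's Array Size (rows per
    round trip) and rows_per_txn for its transaction size (rows per
    commit)."""
    cur = conn.cursor()
    pending = 0
    for i in range(0, len(rows), array_size):
        batch = rows[i:i + array_size]
        cur.executemany("INSERT INTO stg VALUES (?, ?)", batch)
        pending += len(batch)
        if pending >= rows_per_txn:
            conn.commit()
            pending = 0
    conn.commit()  # flush whatever is left in the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg (emplid TEXT, name TEXT)")
load(conn, [(str(i), "x") for i in range(1234)], array_size=100, rows_per_txn=400)
```

Larger batches mean fewer network round trips and fewer commits, at the cost of more rollback/undo work held per transaction.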

gateleys

Re: performance

Posted: Thu Oct 19, 2006 2:58 pm
by dsscholar
Since nothing in the job/database design has changed and the performance has dropped, I'm not sure changing the job/database will help.

10 G is not sufficient to process large amounts of data.

Re: performance

Posted: Thu Oct 19, 2006 5:33 pm
by chulett
dsscholar wrote: 10 G is not sufficient to process large amounts of data.
Errr... OK. I'm sure the issue here is a little more specific than that, and not just the fact that it is "10 G". :?

Posted: Thu Oct 19, 2006 9:47 pm
by DS_SUPPORT
I hope the Target lookup Hashed File has exceeded the size limit.

Posted: Thu Oct 19, 2006 10:02 pm
by kris007
DS_SUPPORT wrote: I hope the Target lookup Hashed File has exceeded the size limit.
Firstly, there is no mention of a Target Hashed File in the job design.
Secondly... Hope? :shock:
Who would hope for such a ...... :roll: