Slow DS Server Job Performance

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

bkarth
Premium Member
Posts: 9
Joined: Wed Oct 26, 2005 2:04 pm

Slow DS Server Job Performance

Post by bkarth »

Hello,

We are having a problem. We developed DS Server jobs for our DW, and while sequencing we decided to run 5 server jobs at the same time. When we do that the jobs run VERY SLOW; when we run them standalone they run fast. My concern is that when I run all 5 jobs they slow down to a great extent, yet when I monitor resources like CPU, memory, disk I/O and network bandwidth on the ETL box and the SQL box, nothing is alarmingly high (all of them are under-utilized). So where is the bottleneck? Why does one job run fast on its own while running 5 slows them down dramatically? Is there any tuning we have to do in uvconfig, the OS, or on the server hardware side?

Any input on this matter will be really helpful.

thanks,
KB
thumsup9
Charter Member
Posts: 168
Joined: Fri Feb 18, 2005 11:29 am

Post by thumsup9 »

Are the jobs running simultaneously in the sequence? That could slow them down. See whether they take the same time when run one by one. I don't think the UVCONFIG settings are the culprit here.
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Re: Slow DS Server Job Performance

Post by DeepakCorning »

Well, of course: if you run more jobs at the same time, all five are going to hit the CPU at once. Options like in-process or inter-process row buffering may help, but more important is to work out how many jobs you can run at the same time on your kind of CPU without hogging it.
bkarth
Premium Member
Posts: 9
Joined: Wed Oct 26, 2005 2:04 pm

Re: Slow DS Server Job Performance

Post by bkarth »

Well, of course: if you run more jobs at the same time, all five are going to hit the CPU at once --> If that argument were true we should see the CPU at 100%, or at least 50%, when we run the performance monitor.

When we run these jobs simultaneously, we don't find any bottleneck on the CPU (we have an 8-way ETL box), the network, disk I/O, or even memory.

We tried IPC stages between transformers, plus the inter-process and in-process options, etc., and still see no improvement.

We do know that running jobs simultaneously will slow performance down, but we don't expect them to run 10 times slower.
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

Are you trying to use the same table for all 5 jobs?

Are you using a stage to extract and another to update tables in the same job?

Can you tell us the job design?
jdmiceli
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

What is the database engine?

Post by jdmiceli »

Howdy!

What is the database engine for this? If it is SQL Server 2000, then you would need to give a parameter or something to segment the data so each job isn't trying to process the same rows. Also, switch to user-defined query mode and add 'WITH (NOLOCK)' after each table in the query (probably just one anyway). This will access the table without acquiring a hard lock and should speed throughput.
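To illustrate, a user-defined query with the hint might look something like this (the staging table and column names below are just placeholders, not taken from your job):

-- read the staging table without taking shared locks (allows dirty reads)
SELECT CustomerKey, CustomerName, LastUpdated
FROM dbo.STG_Customer WITH (NOLOCK)

The trade-off is that NOLOCK behaves like READ UNCOMMITTED, so uncommitted rows can be read; that is usually fine for a staging table that nothing else is writing to while the job runs.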

If it isn't SQL Server, see if there is similar functionality in whatever flavor you are on.

Hope that helps... :lol:

Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
bkarth
Premium Member
Posts: 9
Joined: Wed Oct 26, 2005 2:04 pm

Post by bkarth »

Are you trying to use the same table for all 5 jobs?
--> No, they access different tables and they perform different functions.

Are you using a stage to extract and another to update tables in the same job?
The jobs read a STG table and do Insert New or Update Existing rows on a target table in the SAME job.

Can you tell us the job design?
These are simple jobs: read from STG tables and use the Insert New or Update Existing Rows option via ODBC (the database is SQL Server 2000).
bkarth
Premium Member
Posts: 9
Joined: Wed Oct 26, 2005 2:04 pm

Re: What is the database engine?

Post by bkarth »

jdmiceli wrote:Howdy!

What is the database engine for this?
The database engine is SQL Server 2000.

If it is SQL Server 2000, then you would need to give a parameter or something that would segment the data so each job isn't trying to process the same rows.
I don't think we need this, as these jobs handle different tables and they don't access the same tables or rows!

Bestest!
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Re: What is the database engine?

Post by DeepakCorning »

That's quite strange. I used to have the same problem (no CPU usage but still bad performance), and I switched the settings for that project from inter-process to in-process and it helped. Now my CPU gets a good hit (not a bad one) and the performance has gone up!! I can't say what the case is with your jobs/server. Does your I/O also show no usage?
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Split each of your jobs into two separate jobs. The first is just database to a sequential file; run it and you will see the performance of pulling the data. Then have the second job read that file and send the rows to the database; now you'll see how fast rows can load using your given DML.
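Something like this, where the stage names are only for illustration:

Job 1 (extract): ODBC_Source ---> Transformer ---> Sequential_File
Job 2 (load):    Sequential_File ---> Transformer ---> ODBC_Target (insert new / update existing)

Timing the two jobs separately tells you whether it is the extract or the load that is eating the time.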
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

bkarth wrote:Are you trying to use the same table for all 5 jobs?
--> No, they access different tables and they perform different functions.

Are you using a stage to extract and another to update tables in the same job?
The jobs read a STG table and do Insert New or Update Existing rows on a target table in the SAME job.

Can you tell us the job design?
These are simple jobs: read from STG tables and use the Insert New or Update Existing Rows option via ODBC (the database is SQL Server 2000).
Splitting the job will work. Are there no transformations done? Why do you need DataStage for this data migration in the first place?

If you do want to do it through DataStage, introducing a Sequential File stage between the two database stages will help. In case you are bulk loading, check that you have no warnings.