Slow DS Server Job Performance
Moderators: chulett, rschirm, roy
Hello,
We are having a problem. We developed DataStage Server jobs for our DW, and while sequencing we decided to run 5 server jobs at the same time. When we do that, the jobs run VERY slowly; run standalone, they run fast. My concerns are: when I run all 5 jobs they slow down to a great extent, yet when I monitor resources like CPU, memory, disk I/O and network bandwidth on the ETL box and the SQL box, nothing is alarmingly high (all of them are under-utilized). So where is the bottleneck? Why does one job run fast, while 5 together slow down dramatically? Is there any tuning we have to do in uvconfig, the OS, or on the server hardware side?
Any input on this matter will be really helpful.
thanks,
KB
-
- Premium Member
- Posts: 503
- Joined: Wed Jun 29, 2005 8:14 am
Re: Slow DS Server Job Performance
Well, of course, if you run more jobs at the same time, all five are going to hit the CPU at the same time. Options like in-process or inter-process buffering may help, but more important is to work out how many jobs you can run at the same time on your kind of CPU without hogging it.
Re: Slow DS Server Job Performance
Well, of course, if you run more jobs at the same time all five are going to hit the CPU at the same time --> If that argument were true, then we should see the CPU at 100%, or at least 50%, in the performance monitor.
When we run these jobs simultaneously, we don't find any bottleneck on the CPU (we have an 8-way ETL box), the network, disk I/O, or even memory.
We tried the IPC stage between transformers, the inter-process and in-process options, etc., and still saw no improvement.
We know running jobs simultaneously will slow performance down, but we don't expect them to run 10 times slower.
-
- Charter Member
- Posts: 560
- Joined: Wed Jul 13, 2005 5:36 am
- Location: Ohio
What is the database engine?
Howdy!
What is the database engine for this? If it is SQL Server 2000, then you would need to give a parameter or something that would segment the data so each job isn't trying to process the same rows. Also, switch to user defined query mode and add 'WITH (NOLOCK)' after each table in the query (probably just one anyway). This will access the table without acquiring a hard lock and should speed throughput.
If it isn't SQL Server, see if there is similar functionality in whatever flavor you are on.
Hope that helps...
Bestest!
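The NOLOCK suggestion above amounts to rewriting the extract as a user-defined query with a SQL Server table hint. A minimal sketch in Python of building such a query string — the table and column names (`STG_ORDERS`, etc.) are made-up placeholders, not anything from the original jobs:

```python
# Build a user-defined SELECT for the ODBC stage that reads a staging
# table with the SQL Server WITH (NOLOCK) hint, so the extract does not
# take shared locks that could block the other concurrent jobs.
# Table and column names here are hypothetical placeholders.

def build_extract_query(table, columns):
    """Return a SELECT with the SQL Server WITH (NOLOCK) table hint."""
    column_list = ", ".join(columns)
    return f"SELECT {column_list} FROM {table} WITH (NOLOCK)"

query = build_extract_query("STG_ORDERS", ["ORDER_ID", "CUST_ID", "AMOUNT"])
print(query)
```

Note that NOLOCK reads uncommitted data, so it is only appropriate when the staging table is not being written to during the extract.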
John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services
"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
Are you trying to use the same table for all 5 jobs?
--> No, they access different tables and perform different functions.
Are you using a stage to extract and another to update tables in the same job?
--> The jobs read a STG table and Insert New or Update Existing rows on a target table in the SAME job.
Can you tell us the job design?
--> These are simple jobs: read from STG tables and do the Insert New or Update Existing Rows option using ODBC (the database is SQL Server 2000).
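The "Insert new or update existing" update action effectively does a keyed check per row to decide between INSERT and UPDATE. A rough Python model of that split, with made-up row data (the real work happens inside the ODBC stage, not in code like this):

```python
# Partition incoming staging rows into inserts and updates by checking
# the target's existing keys -- a rough model of DataStage's
# "Insert new or update existing rows" ODBC update action.
# Row data and the 'key' field name are illustrative only.

def split_upsert(rows, existing_keys):
    """Partition rows (dicts with a 'key' field) into (inserts, updates)."""
    inserts, updates = [], []
    for row in rows:
        (updates if row["key"] in existing_keys else inserts).append(row)
    return inserts, updates

rows = [{"key": 1, "val": "a"}, {"key": 2, "val": "b"}, {"key": 3, "val": "c"}]
inserts, updates = split_upsert(rows, existing_keys={2})
print(len(inserts), len(updates))  # 2 inserts, 1 update
```

One plausible reason for the slowdown described in this thread: each per-row check-then-write is a database round trip, and five jobs doing row-by-row upserts can serialize on row locks and log writes while CPU, memory and network all look idle.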
Re: What is the database engine?
jdmiceli wrote: Howdy!
What is the database engine for this?
The Database Engine is SQLServer 2000
If it is SQL Server 2000, then you would need to give a parameter or something that would segment the data so each job isn't trying to process the same rows.
I don't think we need this, as all these jobs handle different tables; they don't access the same tables or rows!
Bestest!
Re: What is the database engine?
That's quite strange. I used to have the same problem (no CPU usage, still bad performance), and I switched my settings for that project from inter-process to in-process and it helped. Now my CPU gets a good hit (not a bad one) and the performance has gone up! Can't say what the case is with your jobs/server. Does your I/O also show no usage?
Split your jobs into two separate jobs. The first is just database to a sequential file: run it, and you will see how fast you can pull the data. Then have the second job read that file and send the rows to the database. Now you'll see how fast rows can load using your given DML.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
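The split-the-job advice above is really a way to isolate which half is slow. A minimal timing sketch in Python with stand-in extract and load functions — the sleeps and row counts are placeholders for real work, not DataStage APIs:

```python
import time

# Time the "read" half and the "write" half separately, mimicking the
# advice to split the job into database->file and file->database pieces.
# Both stand-in functions below are placeholders for the real stages.

def extract_to_file():
    time.sleep(0.05)   # pretend: SELECT from the STG table into a flat file
    return 1000        # rows written

def load_from_file():
    time.sleep(0.10)   # pretend: read the file and upsert into the target
    return 1000        # rows loaded

timings = {}
for name, phase in [("extract", extract_to_file), ("load", load_from_file)]:
    start = time.perf_counter()
    rows = phase()
    timings[name] = time.perf_counter() - start
    print(f"{name}: {rows} rows in {timings[name]:.2f}s")

slowest = max(timings, key=timings.get)
print("bottleneck half:", slowest)
```

If the load half dominates when five copies run at once but not when one runs alone, the contention is on the target database side rather than the ETL box.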
Splitting the job will work. Are there no transformations done? Why do you need DataStage to do this data migration in the first place?
bkarth wrote:
Are you trying to use the same table for all 5 jobs?
--> No, they access different tables and perform different functions.
Are you using a stage to extract and another to update tables in the same job?
--> The jobs read a STG table and Insert New or Update Existing rows on a target table in the SAME job.
Can you tell us the job design?
--> These are simple jobs: read from STG tables and do the Insert New or Update Existing Rows option using ODBC (the database is SQL Server 2000).
If you do want to do it through DataStage, introducing a Sequential File stage between the two database stages will help. If you are bulk loading, check that you have no warnings.
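Putting a sequential stage between the two database stages amounts to landing the rows in a flat file between the read and the write. A small Python/csv sketch of that landing step — the file name and row data are made up for illustration:

```python
import csv
import os
import tempfile

# Land extracted rows in a delimited file, then read them back for the
# load phase -- the same decoupling a Sequential File stage gives you
# between the source and target database stages. All names illustrative.

rows = [("1", "alpha"), ("2", "beta"), ("3", "gamma")]

path = os.path.join(tempfile.gettempdir(), "stg_landing.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)               # extract half writes the file

with open(path, newline="") as f:
    landed = [tuple(r) for r in csv.reader(f)]  # load half reads it back

print(landed == rows)  # True: the landing file decouples the two halves
```

Because the extract finishes before the load starts, neither half can stall the other, which makes it obvious where the time goes.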