Concurrently running jobs use 95% CPU but less than 30% memory

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
Hongqing Tang
Premium Member
Posts: 12
Joined: Thu Apr 27, 2006 3:17 pm
Location: Bay Area

Concurrently running jobs use 95% CPU but less than 30% memory

Post by Hongqing Tang »

I am running jobs concurrently. At one point in time there were 36 jobs running at once. We saw CPU utilization shoot up to 95% while memory usage stayed below 30%. These are simple read-then-write jobs, so they should not consume much CPU. Is there a set of environment variables we can adjust to make the jobs use more memory?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Like any batch process, a DataStage job will consume as much CPU as it can to finish as quickly as possible. Since streaming I/O is used wherever possible, there are no waits on I/O to preclude use of the CPU. That you aren't using much memory means exactly that: your job has no memory-intensive activity occurring.
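To see why a read-then-write job can peg a CPU while barely touching memory, here is a minimal Python analogy (this is not DataStage code, just an illustration of the streaming pattern): each record is read, transformed, and written one at a time, so the CPU stays busy but the resident set never grows with the input size.

```python
import os

def stream_copy(src, dst, transform=lambda rec: rec):
    """Copy records one at a time; the input is never held in memory at once."""
    count = 0
    for record in src:                 # pull one record from the source...
        dst.write(transform(record))   # ...transform it and push it downstream
        count += 1
    return count

# One million short "records" from a lazy generator: the CPU does all the
# work, but memory stays small because nothing is buffered in bulk.
src = (f"row {i}\n" for i in range(1_000_000))
with open(os.devnull, "w") as dst:
    n = stream_copy(src, dst, transform=str.upper)
```

Only operations that must materialize data in bulk (sorts, aggregations, lookups) would push memory usage up; a pure copy like this never will, no matter how many jobs run side by side.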
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Post Reply