Datastage Jobs Not Running

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

Do some load balancing and don't overload the server. You can also ask IBM for a patch to increase the timeout defined for the server. Keep the &PH& folder clean and don't let the log files grow too much.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
paranoid
Premium Member
Posts: 185
Joined: Tue May 29, 2007 5:50 am

Post by paranoid »

Hi,

Thanks for the reply. I will contact IBM on this as you suggested.
Does clearing the DS logs in Director also clear the files in &PH& automatically, or do the files in that directory need to be removed manually? Sorry for the stupid question.

Sue
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Removing or purging DS logs is different from &PH&.

Logs are stored in RT_LOG hashed files and are related to the specific job's logs.

&PH& stores the phantom process history - it operates more at the kernel level. It is the one central area through which your process communication occurs.

To clear it, log in to DS Administrator, select the project you wish to clear, and run the command

Code: Select all

CLEAR.FILE &PH&

Clearing &PH& will assist DataStage to 'turn around' better.
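For illustration only, the effect of CLEAR.FILE (empty the contents, keep the container) can be sketched from the operating system shell. This sketch runs against a throwaway scratch directory, not a real project - the real target would be your project's &PH& directory, and TCL remains the prescribed route:

```shell
# Demonstrate "clear the contents, keep the directory" on a scratch dir,
# analogous to what CLEAR.FILE does to &PH&. The file names below are
# hypothetical stand-ins for phantom files.
PH_DIR="$(mktemp -d)/PH"
mkdir -p "$PH_DIR"
touch "$PH_DIR/DSD.RUN_1" "$PH_DIR/DSD.RUN_2"

# Delete every regular file inside the directory, but leave the directory
find "$PH_DIR" -mindepth 1 -maxdepth 1 -type f -delete

ls -A "$PH_DIR"    # directory still exists, now empty
```

As noted further down the thread, only do this kind of thing when no jobs are running in the project.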
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

:!: Most people set up a cron script to prune that directory - i.e., delete files older than x days to keep a rolling x days in the phantom directory. It's important to note that CLEAR.FILE is like a truncate, so if you take that route you should only do it when no jobs are running.
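A minimal sketch of such a prune, assuming GNU find and a 7-day retention - the retention period and any real path are assumptions you'd adjust for your install. The demo below uses a scratch directory so it's safe to run anywhere:

```shell
# Keep a rolling 7 days of phantom files: delete anything older.
# Demonstrated on a scratch directory; in a real crontab the path would
# be your project's &PH& directory.
PH_DIR="$(mktemp -d)"
touch -d '10 days ago' "$PH_DIR/old_phantom_log"   # should be pruned
touch "$PH_DIR/recent_phantom_log"                 # should survive

DAYS=7
find "$PH_DIR" -maxdepth 1 -type f -mtime +"$DAYS" -delete

ls "$PH_DIR"    # only recent_phantom_log remains
```

A crontab entry might then look like `0 3 * * * find '/path/to/project/&PH&' -type f -mtime +7 -delete` (the path is hypothetical; note the quoting, since `&` is special to the shell).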

And the solution to your -14 error is to stop overloading your server. Sometimes that's not about how many jobs are running but rather about not attempting to start so many at the same time - stagger their starts a little. While there (allegedly) is a "patch" for this to increase the timeout value, it's not something IBM just gives to anyone who asks; you have to make a pretty compelling business case for it, from what I understand.
-craig

"You can never have too many knives" -- Logan Nine Fingers
vivekgadwal
Premium Member
Posts: 457
Joined: Tue Sep 25, 2007 4:05 pm

Post by vivekgadwal »

paranoid wrote:The jobs failed today as well, and finally I could find the error code when running manually on the server. It says "Status code = -14 DSJE_TIMEOUT".
Sue,

This was exactly our problem too. Are the jobs getting fired at all later? I mean, if you try to run them manually, do they run? Or do they just keep quiet, logging nothing, and stay as they were before?
Vivek Gadwal

Experience is what you get when you didn't get what you wanted
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

paranoid wrote: I would contact IBM on this as you suggested.
I would first suggest cleaning the &PH& folder and the old job logs, then checking again. Also monitor resource usage after you trigger the jobs: if CPU usage exceeds 90-95%, try to reduce the number of jobs running concurrently, otherwise you may see other error messages and performance will suffer. IMO
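One rough way to watch for that from the shell is to compare the load average to the CPU count - a sketch only, since load average is just a proxy for the 90-95% CPU figure above, and it assumes a Linux host with /proc:

```shell
# Rough CPU-pressure check: compare the 1-minute load average to the
# number of CPUs. A ratio near or above 1.0 suggests the box is
# saturated and fewer concurrent jobs should be started.
load=$(cut -d ' ' -f1 /proc/loadavg)
cpus=$(nproc)
ratio=$(awk -v l="$load" -v c="$cpus" 'BEGIN { printf "%.2f", l / c }')
echo "load=$load cpus=$cpus ratio=$ratio"
```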
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
paranoid
Premium Member
Posts: 185
Joined: Tue May 29, 2007 5:50 am

Post by paranoid »

Thanks all. Can I clear all of the files in the &PH& folder? Will there be any issue with the existing jobs if I delete all the files in it?

If we need to clear that folder, should we make sure that no job is running at that time?

@ Vivek -- Yes, after some time when we ran them manually, they ran fine.

Sue
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Yes - you can clear all files from &PH&. The prescribed method is to clear from TCL.

Yes - you must ensure that none of your jobs in that project are running when you do this.
paranoid
Premium Member
Posts: 185
Joined: Tue May 29, 2007 5:50 am

Post by paranoid »

Thank you, Sai. Currently the size of the &PH& folder is 263 MB.
Can I use the command you gave in DS Admin to clear the files?
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Yes.
paranoid
Premium Member
Posts: 185
Joined: Tue May 29, 2007 5:50 am

Post by paranoid »

Hi,

Today the jobs are failing again; however, I found the message 'Unable to lock project' in the &PH& folder logs. This error occurs at exactly 2 AM EST and continues until 4 AM EST. Any ideas on why this is happening?

Thanks,
Sue
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Are you sure it's not "Project is currently locked"? I couldn't find your message anywhere in the forums, but that one I've seen myself.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Do "they" quiesce the database (or similar) between 02:00 and 04:00 for system maintenance such as backups?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
paranoid
Premium Member
Posts: 185
Joined: Tue May 29, 2007 5:50 am

Post by paranoid »

Hi everyone,

We have finally resolved this issue after a struggle of 10 days or so.

After a DS restart and clearing the files in the &PH& folder, the issue is resolved. No more failures today :)

As per IBM Support, the number of files in the &PH& folder should not exceed 1000. Since we had more than 3000 files, they advised us to clear the files in the &PH& folder. The same suggestion was given by the DS experts in this thread as well.
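For anyone wanting to keep an eye on this, a small sketch that counts the files and warns past the 1000-file figure IBM quoted - the directory here is a scratch stand-in, and in real use you'd substitute your project's &PH& path:

```shell
# Count files in a &PH&-style directory and warn past IBM's quoted limit.
# Demonstrated on a scratch directory with three dummy files.
PH_DIR="$(mktemp -d)"
for i in 1 2 3; do touch "$PH_DIR/phantom_$i"; done

LIMIT=1000
count=$(find "$PH_DIR" -maxdepth 1 -type f | wc -l)
if [ "$count" -gt "$LIMIT" ]; then
    echo "WARN: $count files in $PH_DIR (limit $LIMIT) - time to clean up"
else
    echo "OK: $count files in $PH_DIR"
fi
```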

Before clearing the folder, we restarted the DS server as well.

I am very thankful to each and every one of you who replied to this post.

Have a nice day!!

Sue :D