Phantom processes are not exiting after a job has completed

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

tirumal_nit
Participant
Posts: 20
Joined: Fri May 16, 2008 3:00 am
Location: Bangalore

Phantom processes are not exiting after a job has completed

Post by tirumal_nit »

Ascential DataStage 7.5: phantom processes are not exiting after a job has completed. These processes can only be killed manually on UNIX using the "kill" command.
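For reference, this is roughly how I am finding and killing them at the moment (the process name patterns below are just what I happen to see on our server, so please treat them as examples rather than a definitive list):

ps -ef | grep -E 'phantom|DSD\.RUN|dsapi_slave' | grep -v grep   # list leftover DataStage processes
kill <pid>                                                       # normal terminate once the job has definitely finished
kill -9 <pid>                                                    # last resort if the process will not die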

As far as I know, these phantom processes should exit on their own when the job finishes.

Please help me out in this regard.


Thanks in advance.
Thanks,
Tirumal G
meet_deb85
Premium Member
Posts: 132
Joined: Tue Sep 04, 2007 11:38 am
Location: NOIDA

Post by meet_deb85 »

What is your job doing?

Can you please explain your job in detail?
tirumal_nit
Participant
Posts: 20
Joined: Fri May 16, 2008 3:00 am
Location: Bangalore

Post by tirumal_nit »

Hi,

It is a simple job which reads data from a source DRS stage and writes it to a target DRS stage.

Source DRS stage --> CRC logic --> Target DRS stage.
Thanks,
Tirumal G
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So... your job completes without warnings or errors and then you find leftover processes? Can you be precise about exactly what 'phantom processes' you are finding and killing? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

I'd also recommend looking in the &PH& sub-directory of the project and seeing if there are any messages in the phantom files for the jobs. Usually there is a message in there with more detail.

There may be a ton of files in the &PH& sub-directory, since it is never cleaned out automatically, even after a clean run. To make finding the right files easy, you can either delete the old files or archive them if you really want to keep a copy of the old logs. Just make sure no jobs are active when you clean up the old logs. Then, when you run your job, the only files in there will be the ones related to the job you just ran.

I just noticed that you are on UNIX. Remember that & is a special character to the shell, so if you use it in a path you'll need to put a backslash in front of each occurrence (i.e. cd \&PH\&) to avoid issues.
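For example, something along these lines should do it (the project path is illustrative, so substitute your own, and the archive step is optional):

cd /path/to/your/project/\&PH\&               # or quote it instead: cd '&PH&' from inside the project directory
ls -ltr | tail                                # the newest phantom files are listed last
more <newest_file>                            # read the messages for the run in question
tar cvf /tmp/PH_archive.tar ./* && rm -f ./*  # optional: archive then clear old files, only when no jobs are running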
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
tirumal_nit
Participant
Posts: 20
Joined: Fri May 16, 2008 3:00 am
Location: Bangalore

Post by tirumal_nit »

Thank you Chulett and Andy Sorrell.

I'll check the &PH& entries and let you know the exact messages.

Thank you for your suggestion.
Thanks,
Tirumal G