Job running continuously

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

atul sharma
Premium Member
Posts: 17
Joined: Thu Jun 30, 2005 6:52 am
Location: United States

Job running continuously

Post by atul sharma »

Hi All

Some of the jobs in our production region keep on running continuously.

This happens occasionally. Is there any particular reason why this happens?

Could you please let me know how to avoid it?

thanks in advance
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

Well, DataStage is not expected to run its jobs in breaks. :wink:

Can you describe your problem more clearly?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Atul, without detailing exactly what happens, there is little anyone can do. It is like asking a doctor for a diagnosis after saying "Sometimes I don't feel so good".
What sort of jobs are they (do they connect to a database?). When they run continuously, is the number of records processed stuck at some value, and if so, is it the number of rows you expect or fewer? Do the jobs use CPU while in this state?
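If you want a quick way to answer the CPU question from the command line, something along these lines works on most Unix engine hosts (a rough sketch only; it assumes the parallel job runs as osh processes, which is typical for PX, and the grep pattern is just an illustration):

# List the parallel engine (osh) processes and see whether they are
# accumulating CPU time or just sitting idle.
ps -ef | grep osh | grep -v grep
ps -eo pid,pcpu,time,etime,args | grep osh | grep -v grep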
prabu
Participant
Posts: 146
Joined: Fri Oct 22, 2004 9:12 am

Re: Job running continuously

Post by prabu »

atul sharma wrote:Hi All

Some of the jobs in our production region keep on running continuously.

This happens occasionally. Is there any particular reason why this happens?

Could you please let me know how to avoid it?

thanks in advance
Assuming you fetch from a database or a file:
1) Check whether there is any ongoing DML against the source before your job starts.
This will cause your job to fetch a "read-consistent view", which may delay the fetch.
2) Input files should not be written to while you are reading them.

In short, check for any parallel processes that share the same resources as the job.
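A quick way to check for that sort of contention from the engine host (a sketch only; the file path, credentials and table name are placeholders for your own, and the dictionary query needs the relevant SELECT privileges):

# 1) Is anything still writing the input file while the job reads it?
fuser -u /data/landing/source_file.dat

# 2) Does another session hold DML locks on the source table?
sqlplus -s etl_user/etl_pwd <<'EOF'
SELECT session_id, owner, name, mode_held
FROM   dba_dml_locks
WHERE  name = 'MY_SOURCE_TABLE';
EOF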

On a lighter note, isn't it better to have the job "keep on running" than to have it abort before you could say "Jack Robinson"? :) :)
atul sharma
Premium Member
Posts: 17
Joined: Thu Jun 30, 2005 6:52 am
Location: United States

Post by atul sharma »

It is a PX job.
The moment I run it, the log shows "Starting job"
followed by the info message "Environment variable settings".

It gets to that point and then it seems to hang.

The job has

an Oracle stage as the source,

a Lookup which also uses another Oracle stage,

a Transformer and two output links, each going again to an Oracle stage.

When I try to stop the job, it still shows its status as Running.
I then have to use the "Clear Status File" option to clear the status.

Is there any other means of stopping the job, or is using the Clear Status File option valid here?

thanks in advance
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

Check for the following:
1. Whether you see any rows propagating through the job.
2. Whether you see the process ID under "Cleanup Resources" in Director.

It may be that your job is getting killed in Unix, perhaps because of the high memory it needs to run (just an assumption).

There are several ways to stop a job run:
1. Kill the process ID using "Cleanup Resources" in Director.
2. Use the UV command line.

Search this forum for details on how to use each of them.
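If you go the command-line route, here is a rough sketch from the engine host (the project and job names are placeholders, and $DSHOME must point at your DSEngine directory):

# Source the engine environment, ask the engine to stop the job,
# then confirm its status afterwards.
cd $DSHOME
. ./dsenv
bin/dsjob -stop MyProject MyHungJob
bin/dsjob -jobinfo MyProject MyHungJob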
pkomalla
Premium Member
Posts: 44
Joined: Tue Mar 21, 2006 6:18 pm

Post by pkomalla »

Hi Atul,

Can you check whether the same Oracle tables are being used in parallel by any other job, or more than once within the same job?

If so, check whether INITRANS and FREELISTS on those tables are set to 1 or to something higher (they should be at least the number of concurrent users of the table).

If they are set to 1 and the table is used in parallel, that is likely the problem. Change the values.
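A sketch of how you might check (and, if needed, raise) those settings; the credentials and table name are placeholders, and FREELISTS is only relevant when the tablespace uses manual segment space management:

sqlplus -s etl_user/etl_pwd <<'EOF'
-- Current settings for the table
SELECT table_name, ini_trans, freelists
FROM   user_tables
WHERE  table_name = 'MY_TARGET_TABLE';

-- Example of raising them to match the degree of concurrency:
-- ALTER TABLE my_target_table INITRANS 4 STORAGE (FREELISTS 4);
EOF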
meena
Participant
Posts: 430
Joined: Tue Sep 13, 2005 12:17 pm

Post by meena »

Hi atul sharma,
I think this is happening because of the database. It is probably taking a long time to run the SQL that extracts from the database, which may have many tables in it and many applications running against it. Copy one of your SQL statements and test it in TOAD against the production table. If it is slow in returning results, check with your DBA. It also depends on the data in the table and the SQL you are using. Even after you kill the process in DataStage, the query may still be running on the database side, which can slow everything down and take a long time. I am not sure, but I had the same issue and solved it by working with my DBA.
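If you suspect the query is still running on the database after the DataStage processes have gone, your DBA can look for (and, if necessary, end) the leftover session. A sketch with placeholder credentials, a placeholder schema name and an example sid/serial#:

sqlplus -s system/manager <<'EOF'
-- Sessions owned by the ETL account and how long they have been busy
SELECT sid, serial#, username, status, program, last_call_et
FROM   v$session
WHERE  username = 'ETL_USER';

-- A DBA can then end a leftover session, for example:
-- ALTER SYSTEM KILL SESSION '123,456';
EOF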
atul sharma wrote:It is a PX job.
The moment I run it, the log shows "Starting job"
followed by the info message "Environment variable settings".

It gets to that point and then it seems to hang.

The job has

an Oracle stage as the source,

a Lookup which also uses another Oracle stage,

a Transformer and two output links, each going again to an Oracle stage.

When I try to stop the job, it still shows its status as Running.
I then have to use the "Clear Status File" option to clear the status.

Is there any other means of stopping the job, or is using the Clear Status File option valid here?

thanks in advance