Monitoring Jobs

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Monitoring Jobs

Post by pankajg »

I have a multiple-instance job. Checking it in the Monitor, I found that the Transformer stage finished with an 'Unknown' status, while the overall job finished with a status of Finished.

When I try to see the detailed log for that particular instance, I cannot see its logs, and no output is generated.

For the other instances of the same job, I can see the detailed information in the Monitor, view the log file, and the output is also generated.

Can someone help me understand this particular behavior of DataStage?

Thanks in advance.
Failures push you towards Success.
Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

Post by Kirtikumar »

For every job there is a log setting called Auto Purge. Using this setting you can ask DataStage to clear the logs automatically, say after 3 runs or after 3 days.
This setting exists at project level as well as at job level. If it is set at project level, each newly created job inherits it from the project by default, but you can override it in the job.

So the very first thing is to check what is set for Auto Purge in the job you are monitoring. To check it, go to DS Director, select the job whose logs you want to monitor, and select Job --> Clear Log. It will open a window with options for Immediate clear and Auto Purge.
From it, select the Auto Purge radio button and then see the default values.
Here you can disable it or set it according to your need, such as storing logs for 10 days or 20 runs. Disabling it will cause the log file to keep the logs for all runs, and may eat up storage space if it is not cleared manually from time to time. This change will override the project default.
For more info on this, check the DS Director guide or PDF.
Regards,
S. Kirtikumar.
attawoot
Participant
Posts: 1
Joined: Tue May 31, 2005 3:31 am

Post by attawoot »

Thanks for the reply.

But as I said, it is a multiple-instance job, and I can view the logs for all the instances apart from this one particular instance. So Auto Purge for this job is not active.

This has happened a couple of times, and there has been no definite solution; nor can I trace the cause of the issue.

Has anyone faced this issue before? Is there something I could check?

Thanks in Advance
Pankaj
Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

Post by Kirtikumar »

Have you checked the Auto Purge setting by the procedure mentioned?

If you are concluding that Auto Purge is not active just because the logs for all instances are shown, that conclusion might be wrong.

Please check this setting in the logs once. While checking, make sure that you are selecting the original job and not the job instance.
Is the job showing the same behavior next time as well?
Regards,
S. Kirtikumar.
pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Post by pankajg »

Thanks Kirti for the reply. I will check these settings and let you know.

There is one more thing I would like to know: is there a way to export the log files out of DataStage to create a separate local log file (in txt format)?

As I understand it, the log in DataStage is a hashed file where the log entries are stored.

Thanks in advance again for your help.
Failures push you towards Success.
Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

Post by Kirtikumar »

There is a way to do this.

You will have to write a BASIC routine that can access the log file of a job. In it, use functions like DSGetLogSummary and DSGetLogEntry; the routine can then write the logs to a sequential file.

Search the forum for collecting logs and you will find some pointers on how to do that.

Check the BASIC PDF and you will find some more functions to attach to the job whose logs you want to collect. Some functions you will have to use are DSAttachJob and DSDetachJob.
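The steps above can be sketched in DataStage BASIC. This is a minimal, untested sketch only: the routine's arguments (JobName, LogPath), the error handling, and the "no limit" meaning of the zero start/end/max values passed to DSGetLogSummary are assumptions to verify against the BASIC guide.

```
* Sketch: dump a job's log summary to a sequential text file.
      $INCLUDE DSINCLUDE JOBCONTROL.H

      JobHandle = DSAttachJob(JobName, DSJ.ERRNONE)
      If NOT(JobHandle) Then
         Ans = "Could not attach to job " : JobName
         GoTo Done
      End

      * Summary of all event types; "0"/"0"/0 assumed to mean no
      * time window and no entry limit -- check the BASIC guide.
      Summary = DSGetLogSummary(JobHandle, DSJ.LOGANY, "0", "0", 0)

      OpenSeq LogPath To LogFile Then
         NumEvents = DCount(Summary, @FM)
         For i = 1 To NumEvents
            WriteSeq Summary<i> To LogFile Else Null
         Next i
         CloseSeq LogFile
         Ans = NumEvents : " log entries written"
      End Else
         Ans = "Could not open " : LogPath
      End

      ErrCode = DSDetachJob(JobHandle)
Done:
```

For the full text of individual events rather than the one-line summaries, DSGetLogEntry can be called with each event id from the summary.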
Regards,
S. Kirtikumar.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

pankajg wrote:Is there a way to export the log files out of datastage to create a seperate local log file (txt format) ?
You can create a DataStage server job to achieve this. The only tricky part is determining the log hashed file name; you can obtain it in another server job (SELECT JOBNO FROM DS_JOBS WHERE NAME = '#JobName#';) and use that result to supply the log hashed file name as a job parameter to the main job, using an expression such as

Convert(" ", "", "RT_LOG" : FirstJob.$UserStatus)
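As a worked illustration of the expression (the value " 123" is hypothetical; the real job number is whatever the first job returns in its user status):

```
* Suppose FirstJob.$UserStatus returned " 123". Then:
*   "RT_LOG" : FirstJob.$UserStatus      evaluates to "RT_LOG 123"
*   Convert(" ", "", "RT_LOG 123")       evaluates to "RT_LOG123"
* i.e. Convert strips the spaces, yielding the hashed file name
* RT_LOGnnn that holds the log for job number nnn.
```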
Last edited by ray.wurlod on Wed Oct 11, 2006 2:21 pm, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Post by pankajg »

Thanks Ray,

I searched through the forum and found a lot of useful topics on exporting and working with the log files.
Failures push you towards Success.