We need to read the Unix log files of DataStage jobs from the operating system and use them as input to HP OpenView.
Is it possible to read the per-job files (I know they are numbered by job), and can they be used as input to a tool like HP OpenView, which could then give a concise on-screen summary of failed jobs from these log files?
An early reply will be appreciated.
thanks
Mukund
Reading datastage log files in unix
Re: Reading datastage log files in unix
dsjob (search on this site) will allow you to read a log; it's part of DataStage on the server. Some problems:
you have to specify the name of the job whose log you want to see. If you have a lot of jobs, it takes an awful lot of time and resources to cover them all.
I created similar stuff to integrate with Tivoli to monitor some sequencer jobs. Doing it for all jobs would probably be too much of a workload.
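A minimal sketch of that per-job dsjob approach. The project and job names are placeholders, the engine path is the usual default but may differ on your install, and the assumption that -logsum emits tab-separated "id, type, timestamp, message" lines should be verified against your release:

```shell
#!/bin/sh
# Sketch: read one named job's log with dsjob and reshape it for a monitor
# such as HP OpenView. PROJECT and JOB are hypothetical names.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}  # assumed path
PROJECT=myproject
JOB=myjob

# Turn assumed tab-separated -logsum lines (id, type, timestamp, message)
# into pipe-delimited records, keeping only WARNING and FATAL events.
to_alerts() {
    awk -F'\t' '$2 == "WARNING" || $2 == "FATAL" { print $1 "|" $2 "|" $3 "|" $4 }'
}

# Only attempt the call where a DataStage engine is actually installed.
if [ -x "$DSHOME/bin/dsjob" ]; then
    "$DSHOME/bin/dsjob" -logsum "$PROJECT" "$JOB" | to_alerts
fi
```

The filter keeps the feed small, which matters given the time-and-resources concern above: OpenView only ever sees the problem events, not every informational entry.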
Ogmios
You could create a DataStage job to dump the requisite contents from DataStage job logs into, for example, delimited text files.
You could then create a job control routine to process a particular selection of jobs in the project (maybe all of them, maybe just the ones with a status of DSJS.RUNWARN, DSJS.FAILED or DSJS.CRASHED), for each of these executing your "dump log" job.
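The selection logic above could equally be driven from the shell rather than a job control routine. This is a hedged sketch, not the poster's actual routine: the status numbers 2 (DSJS.RUNWARN), 3 (DSJS.RUNFAILED) and 96 (DSJS.CRASHED) correspond to the constants named above, but the grep on the -jobinfo output layout is an assumption to verify on your release:

```shell
#!/bin/sh
# Sketch: list every job in the project, check its last run status, and dump
# the log summary only for jobs that warned, failed, or crashed.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}  # assumed path
PROJECT=myproject   # hypothetical project name

wants_dump() {
    # Read "Job Status : ... (n)" lines on stdin; succeed when n is
    # 2 (RUNWARN), 3 (RUNFAILED) or 96 (CRASHED).
    grep -E 'Job Status.*\((2|3|96)\)' >/dev/null
}

if [ -x "$DSHOME/bin/dsjob" ]; then
    "$DSHOME/bin/dsjob" -ljobs "$PROJECT" | while read -r job; do
        if "$DSHOME/bin/dsjob" -jobinfo "$PROJECT" "$job" | wants_dump; then
            # One delimited text file per problem job, ready for OpenView.
            "$DSHOME/bin/dsjob" -logsum "$PROJECT" "$job" > "/tmp/${job}.log"
        fi
    done
fi
```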
Metadata for the RT_LOGnnn files is excluded from being imported, but can be listed. It has, in the past, been posted on this forum, too.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Source is either a Hashed File stage or UniVerse stage. Table name is RT_LOGnnn, where nnn is the job number from DS_JOBS. I assume you can use a Sequential File stage to create your delimited text file. Search the forum for the column definitions in RT_LOGnnn. You will need to create them in the Repository, since they are explicitly blocked from being imported. Though I think I may once have described a workaround for that.
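For completeness, here is a sketch of looking up the nnn by hand from the engine shell. The uvsh invocation, the project directory path, and the DS_JOBS column name JOBNO are all assumptions drawn from this thread, so treat the whole thing as illustrative only:

```shell
#!/bin/sh
# Sketch: find a job's number in DS_JOBS, which gives the RT_LOGnnn file name
# to point a Hashed File or UniVerse stage at.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}  # assumed path
PROJECT_DIR=/path/to/project   # hypothetical project directory
JOB=myjob                      # hypothetical job name

log_table_for() {
    # Build the RT_LOGnnn table name from a job number read on stdin.
    read -r jobno && echo "RT_LOG${jobno}"
}

if [ -x "$DSHOME/bin/uvsh" ]; then
    cd "$PROJECT_DIR" || exit 1
    # Assumed UniVerse SQL; the returned JOBNO is the nnn in RT_LOGnnn.
    "$DSHOME/bin/uvsh" <<EOF
SELECT NAME, JOBNO FROM DS_JOBS WHERE NAME = '$JOB';
EOF
fi
```

With the number in hand (say 123), the stage would read RT_LOG123, using the column definitions you created in the Repository as described above.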
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.