Script for Cleaning the Log Files
Moderators: chulett, rschirm, roy
Hi,
I would like to clear the log files on our UNIX box. The disk fills up every two to three weeks, and we then clear the logs through Director to free some space.
I would like to know whether we can create a script that clears the log files, and that sends an email alert saying "RUN THE LOG CLEAR SCRIPT" whenever the box reaches 98% full.
Please provide any suggestions or ideas.
Pradeep Kumar
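A minimal sketch of the alerting half of this request, assuming a POSIX shell with `df -P`, `awk`, and `mailx` available. The mount point, threshold, and recipient address are all placeholders to adjust for your environment:

```shell
#!/bin/sh
# Hypothetical disk-space watcher -- not from this thread.
# Emails an alert when usage on the watched filesystem crosses a threshold.
FS=${1:-/}                    # filesystem to watch; pass your project mount point
THRESHOLD=${2:-98}            # alert when usage >= this percentage
RECIPIENT=dsadmin@example.com # hypothetical address -- change it

# POSIX df -P: column 5 of the second output line is "Use%"; strip the % sign.
PCT=$(df -P "$FS" | awk 'NR==2 { sub(/%/, "", $5); print $5 }')

if [ "$PCT" -ge "$THRESHOLD" ]; then
    echo "Filesystem $FS is at ${PCT}% - RUN THE LOG CLEAR SCRIPT" |
        mailx -s "DataStage disk space alert on $(hostname)" "$RECIPIENT"
fi
```

Scheduled from cron (say, every 15 minutes), this would send the alert without anyone having to watch the box.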
Setting up the auto-purge option in Director is the best approach. Note that if the CLEAR.FILE option is used, it removes the auto-purge property.
Also clear the &PH& directory regularly, and delete any orphaned jobs.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
Or the code provided by Ray.
Code: Select all
* Clear every job log (RT_LOGnnn) while preserving its auto-purge settings.
Open "DS_JOBS" To DSJobs.fvar
Then
   ClearSelect 9
   SSelect DSJobs.fvar To 9
   Loop
   While ReadNext JobName From 9
      * Only process real job names (pattern: one letter, then anything).
      If JobName Matches "1A0X"
      Then
         * Field 5 of the DS_JOBS record holds the job number.
         ReadV JobNumber From DSJobs.fvar, JobName, 5
         Then
            LogName = "RT_LOG" : JobNumber
            Open LogName To Log.fvar
            Then
               FileLock Log.fvar
               * CLEAR.FILE discards the purge settings, so save them first.
               Read PurgeSettings From Log.fvar, "//PURGE.SETTINGS"
               Else PurgeSettings = 0 : @FM : 0 : @FM : 0
               Perform "CLEAR.FILE " : LogName
               Write PurgeSettings To Log.fvar, "//PURGE.SETTINGS"
               Write 1 To Log.fvar, "//SEQUENCE.NO"
               Write "Log purged by routine." To Log.fvar, 0
               FileUnlock Log.fvar
               Close Log.fvar
            End
         End
      End
   Repeat
   Close DSJobs.fvar
End
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Clearing the DataStage logs won't recover all that much space. Make sure that &PH& and the directory pointed to by UVTEMP are kept as clean as possible too. Also periodically clean up any log/data files used by database utilities that are no longer needed.
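The &PH& cleanup can be scripted along these lines; a hedged sketch, where the project directory path is a placeholder and the 7-day retention window is an assumed policy, not anything from this thread:

```shell
#!/bin/sh
# Hypothetical &PH& housekeeping sketch -- adjust the path and retention
# window for your own install before using.
PROJECT=${PROJECT:-/path/to/project}   # placeholder project directory
PH_DIR="$PROJECT/&PH&"

if [ -d "$PH_DIR" ]; then
    # Entries older than 7 days are assumed stale phantom output; remove them.
    find "$PH_DIR" -type f -mtime +7 -exec rm {} \;
else
    echo "No such directory: $PH_DIR" >&2
fi
```

Run daily from cron, this keeps &PH& from accumulating between the fortnightly clear-outs described above.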
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
UVTEMP is a setting in the uvconfig configuration file. You can get its value using the analyze.shm command.
Code: Select all
$DSHOME/bin/analyze.shm -t | grep UVTEMP
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
1. DSD.RUN is the command DataStage uses to run server jobs. DSD.StageRun is the command DataStage uses to execute code generated by server Transformer stages. Files in &PH& with names ending in "trace" are from stage tracing.
2. Use the find command, filter on owner name, and exec the rm command.
Code: Select all
cd '&PH&'
find . -user yyyy -exec rm {} \;
Last edited by ray.wurlod on Fri Feb 09, 2007 2:11 am, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.