Script for Cleaning the Log Files
Posted: Wed Feb 07, 2007 8:03 pm
by pradkumar
Hi
I would like to clear the log files on the Unix box.
My box fills up every two to three weeks, and then we clear the logs using Director to create some space.
I would like to know whether we can create a script that clears the log files, and whenever the box reaches 98% full it should send an email alerting "RUN THE LOG CLEAR SCRIPT".
Please provide any suggestions or ideas.
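A minimal sketch of the requested alert, assuming a POSIX shell with df available; the filesystem path, 98% threshold, and mail address are assumptions, and the mailx line is left commented because mail configuration varies by site:

```shell
# check_disk FS THRESHOLD: print (and optionally mail) an alert when the
# filesystem holding the DataStage project crosses the usage threshold.
check_disk() {
    fs=${1:-/}
    threshold=${2:-98}
    # df -P gives POSIX single-line output; column 5 is "Use%"
    used=$(df -P "$fs" | awk 'NR==2 { gsub("%", "", $5); print $5 }')
    if [ "$used" -ge "$threshold" ]; then
        echo "RUN THE LOG CLEAR SCRIPT: $fs is at ${used}% full (threshold ${threshold}%)"
        # Assumption: mailx is installed and configured; uncomment to send mail.
        # echo "$fs is at ${used}%" | mailx -s "RUN THE LOG CLEAR SCRIPT" admin@example.com
    fi
}

check_disk /
```

Run from cron (for example every 15 minutes) so the alert arrives before the box is completely full.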
Posted: Wed Feb 07, 2007 8:15 pm
by narasimha
Ken has a utility that will help you with that. Check his website.
It would be better practice to set up purge settings, so that the clearing of the logs happens automatically.
Posted: Wed Feb 07, 2007 9:24 pm
by DSguru2B
A monitoring service should already have been put in place by your SA. Adhere to the purge settings, and also make sure you archive temp files and delete them regularly.
Posted: Wed Feb 07, 2007 11:42 pm
by kumar_s
Setting up the auto-purge option in Director will be the best approach. If the CLEAR.FILE option is used, it will remove the auto-purge property.
Also clear the &PH& directory regularly, and delete the orphaned jobs.
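The &PH& cleanup mentioned above could be scripted roughly as follows, assuming a POSIX shell; the project path is illustrative, and the files should be old enough (or the jobs idle) before deleting:

```shell
# clean_ph DIR DAYS: delete DSD.RUN_* phantom output files older than DAYS.
clean_ph() {
    ph_dir=$1
    days=${2:-7}    # keep a week of phantom output by default
    find "$ph_dir" -name 'DSD.RUN_*' -type f -mtime +"$days" -exec rm {} \;
}

# Example (quote the name so the shell ignores the ampersands);
# the project path is an assumption:
# clean_ph '/opt/Ascential/DataStage/Projects/MyProject/&PH&' 7
```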
Posted: Wed Feb 07, 2007 11:45 pm
by kumar_s
Or you can tweak and use
Kim's code for this purpose.
Posted: Wed Feb 07, 2007 11:46 pm
by pradkumar
Thanks for all of your ideas.
Actually, for my project I am setting up the auto purge.
The actual scenario is that there are some files from last year and I need to clean them up.
I am also looking for a script which does it. I will visit Ken's site.
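A hedged sketch of such a cleanup, assuming a POSIX shell; the directory and the one-year cutoff are assumptions, and the script only lists the candidates so they can be reviewed before anything is deleted:

```shell
# list_old_files DIR DAYS: print files not modified in the last DAYS days.
list_old_files() {
    dir=$1
    days=${2:-365}
    find "$dir" -type f -mtime +"$days" -print
}

# Review the list first (path is an assumption), then delete with e.g.:
# list_old_files /opt/Ascential/DataStage/Projects/MyProject 365 | xargs rm
```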
Posted: Wed Feb 07, 2007 11:57 pm
by kumar_s
Or the code provided by Ray.
Code: Select all
Open "DS_JOBS" To DSJobs.fvar
Then
   ClearSelect 9
   SSelect DSJobs.fvar To 9
   Loop
   While ReadNext JobName From 9
      If JobName Matches "1A0X"
      Then
         ReadV JobNumber From DSJobs.fvar, JobName, 5
         Then
            LogName = "RT_LOG" : JobNumber
            Open LogName To Log.fvar
            Then
               FileLock Log.fvar
               Read PurgeSettings From Log.fvar, "//PURGE.SETTINGS" Else PurgeSettings = 0 : @FM : 0 : @FM : 0
               Perform "CLEAR.FILE " : LogName
               Write PurgeSettings To Log.fvar, "//PURGE.SETTINGS"
               Write 1 To Log.fvar, "//SEQUENCE.NO"
               Write "Log purged by routine." To Log.fvar, 0
               FileUnlock Log.fvar
               Close Log.fvar
            End
         End
      End
   Repeat
   Close DSJobs.fvar
End
Posted: Wed Feb 07, 2007 11:58 pm
by ray.wurlod
Clearing the DataStage logs won't recover all that much space. Make sure that &PH& and the directory pointed to by UVTEMP are kept as clean as possible too. Also periodically clean up any log/data files used by database utilities and which are no longer needed.
Posted: Thu Feb 08, 2007 7:43 am
by kduke
I do not use the code posted. I doubt that Ray uses his. If you are running out of space because of log files, then you have bigger problems. I like the auto-purge settings. You need to get rid of any warnings that you can.
Posted: Thu Feb 08, 2007 1:06 pm
by pradkumar
How do I get to UVTEMP (I mean the path)?
Right now I am searching under Ascential/DataStage/DSEngine/..
But UVTEMP is not found.
Posted: Thu Feb 08, 2007 3:35 pm
by ray.wurlod
UVTEMP is a setting in the uvconfig configuration file. You can get its value using the
analyze.shm command.
Code: Select all
$DSHOME/bin/analyze.shm -t | grep UVTEMP
Posted: Thu Feb 08, 2007 5:24 pm
by pradkumar
Thanks a lot.
I have one more question.
When I look in my &PH& directory, there are many DSD.RUN_nnnn_nnnn files.
Q1) What does DSD.RUN stand for?
Q2) How can I delete them based on user name instead of one by one? For example, one user "yyyy" has 100 files.
Posted: Thu Feb 08, 2007 11:29 pm
by ray.wurlod
1. DSD.RUN is the command DataStage uses to run server jobs. DSD.StageRun is the command DataStage uses to execute code generated by server Transformer stages. Files in &PH& with names ending in "trace" are from stage tracing.
2. Use the find command, filter on owner name, and exec the rm command.
Code: Select all
cd '&PH&'
find . -user yyyy -exec rm {} \;
Posted: Fri Feb 09, 2007 12:36 am
by kumar_s
Ray means to say: find all the files created by a particular user and delete them.
find . * -user yyyy -exec rm {} \;
Posted: Fri Feb 09, 2007 2:13 am
by ray.wurlod
No I didn't. The asterisk is wrong. The dot was missing; I have edited my previous post to include it.