
Auto Purging of Dataset

Posted: Sat Dec 24, 2011 11:49 pm
by ds_avatar
Hi,

Our DS server crashes very frequently. After analysis we found that the disk space is being eaten by datasets (huge in size), since we use them heavily in our parallel jobs.

My question: Is there any way to delete datasets once the corresponding job has completed successfully? Or what is the ideal way to deal with this situation?

As a workaround we have written a shell script and scheduled it in cron, which deletes all dataset files older than 2 days.
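A minimal sketch of such a cron cleanup, assuming a dataset directory of /data/datasets and a default PX engine install path (both are assumptions; adjust to your environment). Note that removing the .ds descriptor files with plain rm leaves the segment files behind on the resource disks, so the script hands each descriptor to orchadmin instead:

```shell
#!/bin/sh
# Hypothetical paths -- change DS_DIR and APT_ORCHHOME to match your install.
DS_DIR=${DS_DIR:-/data/datasets}
ORCHADMIN=${APT_ORCHHOME:-/opt/IBM/InformationServer/Server/PXEngine}/bin/orchadmin

# Find dataset descriptors older than 2 days and let orchadmin remove
# each one, so the segment files on the resource disks are deleted too.
find "$DS_DIR" -name '*.ds' -mtime +2 2>/dev/null | while read -r ds; do
    if [ -x "$ORCHADMIN" ]; then
        "$ORCHADMIN" rm "$ds"
    else
        echo "orchadmin not found; would remove: $ds"
    fi
done
```

The `-mtime +2` matches the 2-day retention above; the `-x` check is only there so the sketch degrades to a dry run on a machine without the engine installed.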

Posted: Sun Dec 25, 2011 12:16 am
by pandeesh
Once the job has finished, delete them using orchadmin in an after-job subroutine, or using an Execute Command activity in a sequence.
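For the after-job-subroutine route, the value you would type into ExecSH (or into an Execute Command activity) is just an orchadmin call with the dataset path parameterized. The parameter names below are hypothetical; `#Param#` is the usual DataStage substitution syntax:

```
# ExecSH input value / Execute Command line (DatasetDir and JobName
# are hypothetical job parameters -- substitute your own):
$APT_ORCHHOME/bin/orchadmin rm #DatasetDir#/#JobName#_out.ds
```

Because the dataset name is driven by the same parameters the job itself uses, each run cleans up only its own datasets.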

Posted: Sun Dec 25, 2011 7:13 am
by roy
I think that a periodic cleanup, or a cleanup just before each load, will also leave your data sets available for some debugging in case you need them.

Posted: Sun Dec 25, 2011 8:41 am
by ds_avatar
pandeesh wrote:Once the job has finished, delete them using orchadmin in an after-job subroutine, or using an Execute Command activity in a sequence.
I also thought of using an Execute Command activity in the job sequence, but what would be the selection criteria to pick up the datasets belonging to the corresponding job?
roy wrote:I think that a periodic cleanup, or a cleanup just before each load, will also leave your data sets available for some debugging in case you need them.
How can a dataset be used for debugging when it is stored in a non-readable format? :(

Posted: Sun Dec 25, 2011 8:47 am
by pandeesh
ds_avatar wrote:I also thought of using an Execute Command activity in the job sequence, but what would be the selection criteria to pick up the datasets belonging to the corresponding job?
Just pass the dataset name in as a parameter, the same way you do at job level (or hard-code it, if you are hard-coding it at job level).
ds_avatar wrote:How can a dataset be used for debugging when it is stored in a non-readable format? :(
It can be, since the Data Set Management option is available in the client tool.
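Besides the client-side Data Set Management tool, orchadmin itself can show what is inside a dataset from the command line. A sketch, assuming orchadmin is on the PATH and using a hypothetical dataset path:

```
# Inspect a "non-readable" dataset descriptor (path is hypothetical):
orchadmin describe /data/datasets/myjob_out.ds   # schema and layout
orchadmin dump /data/datasets/myjob_out.ds       # print records as text
```

So the dataset files are only unreadable to a plain text editor, not to the engine's own tooling.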
IHTH (I Hope This Helps) :)