Hashed file cleanup

Posted: Thu Jun 28, 2012 2:55 pm
by mdbatra
Hi All,

While freeing up space on the production Unix box, we observed that the hashed files directory (other than the account one) is consuming 100+ GB.

As these are non-account hashed files, they could simply be deleted with rm -r <file_name> and rm D_<file_name>. But some of our DS jobs don't delete and recreate the hashed files afresh; a few just clear and reuse the existing ones.

Hence, to be on the safer side, we are wondering whether we could purge/empty these hashed files instead. Would the general Unix command for emptying a text file (i.e. > test.txt) work for hashed files as well (for both the data and overflow files), or could there be some issue?

Or is there any other way of just clearing hashed files in a non-account Unix directory?
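For reference, the non-account cleanup described above might look like the sketch below. The file name CUST_HASH is hypothetical, and the layout shown (a dynamic, type-30 hashed file is a directory holding DATA.30 and OVER.30, with a D_<name> dictionary file alongside) is simulated here purely for illustration; in a real project you would point the commands at the actual hashed files directory.

```shell
# Hypothetical hashed file name -- substitute your own.
HF=CUST_HASH

# Simulate a dynamic hashed file layout for illustration only:
# the data "file" is a directory containing DATA.30 and OVER.30,
# and D_<name> is the dictionary file next to it.
mkdir -p "$HF"
touch "$HF/DATA.30" "$HF/OVER.30"
touch "D_$HF"

# The cleanup described in the post: remove both the data
# file (a directory, hence -r) and its dictionary.
rm -r "$HF"
rm "D_$HF"

# Confirm both are gone.
[ ! -e "$HF" ] && [ ! -e "D_$HF" ] && echo "removed"
```

Note that truncating the internal files with > DATA.30 would corrupt the hashed file's header, so outright deletion (and letting the job recreate the file) is the safer route.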

Thanks

Posted: Thu Jun 28, 2012 3:18 pm
by chulett
You don't need to clear them; any deleted hashed files will simply be recreated the next time a job that writes to them runs. Yes, even if that option isn't specifically checked.

Posted: Thu Jun 28, 2012 3:34 pm
by mdbatra
That's awesome :)

And how about the account ones? We don't have a quick delete mechanism for those, as I understood from a recent post of mine on the same topic. Can we clear those the way I mentioned?

I guess not; otherwise, what would have been the need for an altogether new command, CLEAR.FILE? :(
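For account-resident hashed files, CLEAR.FILE is issued at the engine's TCL prompt from within the project (account) directory. A rough sketch of such a session, with a hypothetical file name MY_HASH (verify the exact environment setup against your engine's documentation):

```
$ cd /path/to/your/project     # the DataStage project (account) directory
$ . $DSHOME/dsenv              # source the engine environment first
$ $DSHOME/bin/uvsh             # enter the engine shell (TCL)
>CLEAR.FILE MY_HASH            # empties the data portion of the hashed file
>CLEAR.FILE DICT MY_HASH       # optionally empties the dictionary as well
>QUIT
```

Unlike deleting at the Unix level, CLEAR.FILE empties the file while keeping its header and the account's VOC entry intact, which is why it is the supported way to clear account-based hashed files.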

No worries... I'd still gain 100 GB, and that would be fine :) Thanks, Craig, for your prompt and helpful response.

Cheers.