
Posted: Tue Apr 15, 2014 2:27 pm
by asorrell
As you can see, it can be hard to automate a delete because some of the sub-files were created or updated at different times. And you are correct in assuming that the find command would cause problems by leaving "stragglers".

There's not an easy solution I'm aware of to tell UNIX to "remove this directory if everything in it is older than 180 days, and leave everything alone otherwise".

No matter what method you use, I recommend the following:

1) Remove the entire "filename" directory and all its sub-files.
2) Delete the matching dictionary (D_filename). (A rough sketch of both steps is below.)
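
To make those two steps concrete, here is a minimal, untested sketch of what such a script could look like, assuming a plain POSIX shell, a standard find, and a pathed hashed file whose dictionary sits alongside it as D_filename. The project path, the hashed file name and the 180-day threshold are all placeholders to adapt to your own layout.

Code:

#!/bin/sh
# Minimal sketch only -- the paths and names below are placeholders.
PROJECT=/path/to/project     # directory holding the hashed files
HASHFILE=filename            # the hashed file (a directory on disk)
DAYS=180                     # age threshold in days

# Does anything inside the hashed file directory have a modification
# time within the last $DAYS days?  find prints nothing if every
# sub-file is older than the threshold.
RECENT=$(find "$PROJECT/$HASHFILE" -type f -mtime -$DAYS -print)

if [ -z "$RECENT" ]; then
    # Nothing recent: remove the whole hashed file directory and its
    # matching dictionary, per steps 1 and 2 above.
    rm -rf "$PROJECT/$HASHFILE" "$PROJECT/D_$HASHFILE"
else
    # At least one sub-file looks recent: leave everything alone so we
    # don't create the "stragglers" problem described above.
    echo "Skipping $HASHFILE - recent activity found"
fi

It would be worth running this with the rm line replaced by an echo first, since rm -rf on the wrong path is unforgiving, and bear in mind that the modification times on the sub-files may not always mean what you expect.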

Posted: Tue Apr 15, 2014 3:00 pm
by pk7
Thanks Andy

But even so, ALL the DATA.30 files (dozens of them in different directories) have exactly the same recent date and time. There is no way they were all updated at the same time, especially in a Development environment where I know for certain that some of these hashed files have not been touched in years! Yet they all carry this recent date. I don't know why that is, but I suspect we can delete these files despite the recent date. We just need to be sure that the data in them has not been touched recently.

Posted: Tue Apr 15, 2014 3:36 pm
by chulett
From what I recall, those timestamps can be updated when the hashed file is read or otherwise accessed, not just when its contents change.
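
If the timestamps can move on a simple read, then they are not enough on their own to prove the data is stale. One rough way to establish that (just a suggestion, not something from the posts above) is to snapshot checksums of the DATA.30 files now and compare them again after a couple of weeks; cksum is standard on UNIX. The project path below is a placeholder, and a fuller check would also cover the OVER.30 overflow file that sits alongside DATA.30 in each hashed file directory.

Code:

# Step 1 (today): record a checksum for every DATA.30 under the project.
find /path/to/project -name DATA.30 -type f -exec cksum {} \; > /tmp/data30.before

# Step 2 (a week or two later): take a second snapshot and compare.
find /path/to/project -name DATA.30 -type f -exec cksum {} \; > /tmp/data30.after

# No output from diff means the contents have not changed, whatever
# the modification times say.
diff /tmp/data30.before /tmp/data30.after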