Purge old files?

samuel k

Does anyone know of a way to delete all files older than a certain number of days in a directory? I've created a public scratch directory on a 10.2 machine in a computer lab, and I would like to have cron running a nightly script to recursively delete all contained files and subdirectories older than 14 days. However, I haven't found out how to do that...

Please help! :)
 
It's probably best to look at the man page for find. The general use is
find path/to/startdir -someflags
and it will then run recursively from startdir, looking for anything that matches your flags. I saw a flag, -newer file, but that just compares each file against another file to see which is newer. There is also -ctime n, but that only matches files that are exactly n days old. I guess that would work if you ran the cron job every day, but if you missed one day, all those files would stay there.
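For what it's worth, here's a quick demo of the difference between a bare n and +n with those time flags (the temp directory and back-dated file are just for illustration, not part of the original setup):

```shell
# Throwaway demo directory; nothing here is the real scratch area.
SCRATCH=$(mktemp -d)
touch "$SCRATCH/fresh.txt"               # modified just now
touch -t 200001010000 "$SCRATCH/old.txt" # back-dated to Jan 1, 2000

find "$SCRATCH" -type f -mtime 14 -print   # exactly 14 days old: no matches here
find "$SCRATCH" -type f -mtime +14 -print  # more than 14 days old: old.txt

rm -rf "$SCRATCH"
```

The +n form is what saves you from the missed-night problem: anything older than the cutoff still matches the next time the job runs, no matter how late it runs.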

Then the key flag is -delete, which removes whatever is found. As long as you can get past cron's limitation that the computer must be running when your event occurs, this should be golden.
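Sketched out, that looks like the following (self-contained demo with a throwaway directory; run your real version with -print first before trusting it with -delete):

```shell
# Demo setup: a throwaway directory standing in for the lab scratch area.
SCRATCH=$(mktemp -d)
touch "$SCRATCH/fresh.txt"                # recent: should survive
touch -t 200001010000 "$SCRATCH/old.txt"  # back-dated: should be purged

# -mindepth 1 keeps find from matching (and deleting) $SCRATCH itself;
# -delete implies depth-first traversal, so files inside a stale
# subdirectory are removed before the (then empty) subdirectory is.
find "$SCRATCH" -mindepth 1 -mtime +14 -delete

ls "$SCRATCH"    # only fresh.txt remains
rm -rf "$SCRATCH"
```

The -mindepth 1 guard is my addition, not from the man page discussion above; without it, a sufficiently old and empty scratch directory could match +14 and get deleted itself.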
 
Actually, I'd approach it this way:

cd to the path you want, then...

find . -atime +14 -print0 | xargs -0 rm -f

This would find all files that hadn't been accessed in more than 14 days, then feed the filename list, en masse, to xargs, which clumps 'em into batches and lets rm go at it. This is typically more efficient than having find spawn rm once per file via -exec. (The -print0 and -0 pair keeps filenames with spaces from being split apart.) Note that you could also skip the second part of the pipe and simply generate a list of matching files with:

find . -atime +14 -print

and see what you get. If you want to use inode change time or modification time, rather than access time, use -ctime or -mtime instead. (Despite the c, -ctime is change time, not creation time.) Ain't Unix grand? :)
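To tie it back to the original question, the nightly run can go straight into cron. A sketch (the 3:30 AM time and /path/to/scratch are placeholders for whatever the lab actually uses):

```shell
# crontab entry (install with `crontab -e`): every night at 3:30,
# purge anything under the scratch directory that hasn't been
# modified in more than 14 days. /path/to/scratch is a placeholder.
30 3 * * * find /path/to/scratch -mindepth 1 -mtime +14 -print0 | xargs -0 rm -rf
```

Using rm -rf here (rather than find's -delete) also clears out old subdirectories even when they still contain files, and -mindepth 1 spares the scratch directory itself. The caveat above still applies: cron only fires if the machine is actually running at that hour.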
 