Gene Hightower
- Website: digilicious.com/
- About: I blog
Recent Actions
- Commented on "Perl to the rescue: case study of deleting a large directory":
  No, you're right: like 100 to 150 million....
- Commented on "Perl to the rescue: case study of deleting a large directory":
  When I do the math, 3-5 spam mails a second for a year gives us more like a million to a million and a half files. Even a million files is enough to choke the UFS2 file system, which is...
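As a quick sanity check of the figures traded in the two comments above (a sketch; the 365-day year is my own assumption), 3-5 messages per second sustained for a year works out to roughly 100 to 150 million files, which matches the corrected figure in the first comment:

```perl
use strict;
use warnings;

# Back-of-envelope check: 3-5 spam messages per second, sustained for
# a 365-day year, gives the number of files created.
my $seconds_per_year = 60 * 60 * 24 * 365;    # 31,536,000
my $low  = 3 * $seconds_per_year;             # 94,608,000
my $high = 5 * $seconds_per_year;             # 157,680,000
printf "%d to %d files, i.e. roughly 100 to 150 million\n", $low, $high;
```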
Comment Threads
- karsten commented on "Perl to the rescue: case study of deleting a large directory":
  Thanks!
  For a couple of days I wrestled with exactly the same problem: 10 million files, and every tool I used likes to cache the directory for "performance" reasons...
  Now that script is running on a client's OS X server where the rsync/OS X 10.6 problem spammed a directory.
- Alceu Rodrigues de Freitas Junior commented on "Perl to the rescue: case study of deleting a large directory":
  This post is quite old already, but I found it via a post of mine on the PerlMonks website.
  As far as I checked, if you're on Linux you could call SYS_getdents via syscall, avoiding a stat() on each file, and collect the names into a list (or a text file).
  After that it would just be a matter of calling unlink on those names.
  The downside is that I'm still trying to figure out how to use getdents from Perl. :-)
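A portable sketch of the streaming-delete idea the thread describes: read one directory entry at a time and unlink it immediately, never materializing the full listing or stat()ing anything. (This uses plain opendir/readdir rather than a raw SYS_getdents call, since syscall numbers and syscall.ph availability vary by platform; whether it sidesteps the caching behaviour the comments mention depends on the filesystem, so treat it as an illustration rather than the post's actual method. The function name stream_delete is my own.)

```perl
use strict;
use warnings;

# Delete every plain file in $dir one entry at a time, without ever
# building the full directory listing in memory.
sub stream_delete {
    my ($dir) = @_;
    opendir(my $dh, $dir) or die "opendir $dir: $!";
    my $deleted = 0;
    while (defined(my $name = readdir $dh)) {
        next if $name eq '.' or $name eq '..';
        # unlink fails on subdirectories (EISDIR), so they are skipped.
        $deleted++ if unlink "$dir/$name";
    }
    closedir $dh;
    return $deleted;
}
```

Called as `stream_delete('/path/to/spammed/dir')`, it returns the number of files removed; subdirectories are left in place because unlink refuses them.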
About blogs.perl.org
blogs.perl.org is a common blogging platform for the Perl community. Written in Perl with a graphic design donated by Six Apart, Ltd.