find and exec

First, we can use find's -exec option:

find . -name CVS -exec ls -dl {} \;

We can also use the shell:

for i in `find . -name CVS` ; do echo -n $i," "; done

Change ls or echo to whatever suits you, such as rm -rf.
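For instance, a sketch of the rm -rf case (the directory layout is hypothetical). Adding -type d restricts the match to directories, and -prune keeps find from trying to descend into a directory it is about to remove:

```shell
# Delete every CVS directory in the tree.
# -type d matches only directories; -prune stops find from
# descending into a CVS directory it is about to remove.
find . -type d -name CVS -prune -exec rm -rf {} \;
```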


Or you can use xargs:

find . -name CVS | xargs ls -dl

If you start long-running processes with xargs, you can also use -n1 (one argument per invocation) and -P4 to run, for example, four processes in parallel.
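For example (the .log pattern and gzip here are just placeholders for whatever long-running job you have):

```shell
# Compress every .log file, running up to four gzip processes at once.
# -print0/-0 keep odd filenames intact; -n1 hands each gzip one file;
# -P4 keeps up to four of them running in parallel.
find . -name '*.log' -print0 | xargs -0 -n1 -P4 gzip
```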

Yes, find|xargs is almost always superior to find -exec, for me. -exec runs the command once for each file, while xargs packs as many filenames onto each command line as the system allows, which can drastically improve performance. It also makes it possible to stick other commands in between the two. To find all files in a directory tree that contain the word 'foo' and change that word to 'bar':

find . -type f | xargs grep -l foo | xargs perl -pi -e 's/foo/bar/g'

You can also use the -print0 option for find and the -0 option to xargs to make sure that filenames with spaces and other special characters in them are handled correctly. In the OP's example of wrapping find in a for loop, any filenames with spaces will be passed to the command inside the loop with their parts broken up.

But most of all, find|xargs means you don't have to remember what does and doesn't have to be escaped in the -exec syntax.

There are a couple of gotchas with the various approaches above.

Going backwards:

If you're using "xargs", you might as well pass "-print0" to find, and "-0" to xargs, like this:

find . -type f -print0 | xargs -0 grep -l foo

This has the advantage of not getting hosed if your file's name contains a space character. This is how I'd do this sort of thing if it absolutely, positively, had to work.

As for using the shell:

for i in `find . -name CVS` ; do echo -n $i," "; done

This has two problems. One is that the backticks don't nest; for that reason I prefer

for i in $(find . -name CVS) ; do echo -n $i," "; done

But that's a minor problem. A bigger problem is that there's a limit to how long a command line can be; if the "find" returns (say) 100,000 file names, the expanded command can blow past that limit and the whole thing will fail.
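If you do want a loop, one sketch that sidesteps both the length limit and the whitespace problem (this uses bash; read -d '' is not POSIX sh) is to stream NUL-terminated names into a while/read loop instead of expanding them all at once:

```shell
# Stream NUL-separated names from find into a while/read loop.
# No command line is ever built from the full result list, and
# names containing spaces or newlines come through intact.
find . -name CVS -print0 |
while IFS= read -r -d '' i; do
    printf '%s, ' "$i"
done
```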

Executive summary: use -print0 and xargs.

If you have a GNU find from recent years, you can skip xargs by using + instead of ;:

find . -type f -exec grep -l foo {} +

But there is no equivalent to xargs’s -P switch in this case.

Also, if you have GNU find of any vintage, you can use the -printf switch to do the entire job of the for loop from the previous examples:

find . -name CVS -printf '%p, '

Finally, find2perl deserves a mention: it translates a find-style command line into equivalent Perl code. All the power of Perl, with the rapid development and flexibility of the find command line.


About Michael Li

I blog about issues resolved at work.