grep (or anything else) many files with multiprocessor power

find . -type f | parallel -j+0 grep -i foobar
Unlike xargs, parallel groups each job's output, so results from concurrent greps are never interleaved. -j+0 runs as many jobs in parallel as you have CPU cores. With parallel you only need -0 (and find's -print0) if your filenames contain a newline ('\n'). GNU parallel is from https://savannah.nongnu.org/projects/parallel/
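If GNU parallel is not installed, a rough approximation uses xargs with -P; note that xargs needs the -print0/-0 pair for any whitespace in filenames, not just newlines, and concurrent jobs may interleave their output. A minimal sketch (the /tmp/grepdemo paths and the file contents are made up for illustration):

```shell
# Scratch tree with a filename containing a space, to show why -print0 matters.
mkdir -p /tmp/grepdemo
printf 'FOOBAR in here\n' > '/tmp/grepdemo/a file.txt'
printf 'nothing\n'        > /tmp/grepdemo/other.txt

# xargs equivalent: -0 pairs with find's -print0 to survive odd filenames;
# -P 4 runs up to 4 greps at once. Unlike parallel, output lines from
# concurrent jobs may mix; -l (list matching files only) keeps this demo tidy.
find /tmp/grepdemo -type f -print0 | xargs -0 -P 4 grep -li foobar
```

With only one matching file the mixing risk does not show here, but on large trees with verbose grep output it can.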

2010-01-30 02:08:46


What Others Think

Have you done any tests to estimate the gain? I don't have a multiprocessor computer, but I'm not totally convinced. I think the find command is limited more by disk throughput than by CPU, but I could be wrong.
Kysic · 437 weeks and 4 days ago
Whether it is faster depends heavily on the file system. But if the files are cached (e.g. if you just searched them for another string), then it is definitely faster.
unixmonkey10455 · 412 weeks and 1 day ago
