What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using find - 1,022 results
perl -MFile::Find=find -MFile::Spec::Functions -Tlwe '$found=1; find { wanted => sub { if (/$ARGV[0]\.pm\z/) { print canonpath $_; $found=0; } }, no_chdir => 1 }, @INC; exit $found;' Collectd/Plugins/Graphite
2013-01-11 11:01:46
User: keymon
Functions: exit find perl
-2

Checks whether the given module is installed anywhere in @INC. It prints the module's path and exits 0 if found, or 1 otherwise.

Based on script from SharpyWarpy in http://www.linuxquestions.org/questions/linux-general-1/how-to-list-all-installed-perl-modules-216603/
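
As a usage sketch, the same one-liner with a different argument (File/Spec is just an example of a module name written in path form) can drive a simple shell test on the exit status:

perl -MFile::Find=find -MFile::Spec::Functions -Tlwe '$found=1; find { wanted => sub { if (/$ARGV[0]\.pm\z/) { print canonpath $_; $found=0; } }, no_chdir => 1 }, @INC; exit $found;' File/Spec && echo "installed" || echo "not installed"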

find . -name "*.pdf" -exec pdftk {} dump_data output \; | grep NumberOfPages | awk '{s+=$2} END {print s}'
find . -maxdepth 1 -type l
find . -type l -exec test ! -e {} \; -delete
2012-12-26 06:27:13
User: seb1245
Functions: find test
Tags: find
2

This command is adapted from http://otomaton.wordpress.com/2012/12/26/find-broken-symbolic-links/

Solutions based on

find -L

don't work when a link forms a loop; in that case an error message is printed.
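
A throwaway test directory (the names are only illustrative) shows the difference with GNU find:

mkdir /tmp/linktest && cd /tmp/linktest
ln -s does-not-exist broken   # ordinary dangling link
ln -s loop loop               # link that points to itself
find -L . -type l             # typically complains "Too many levels of symbolic links" on the loop
find . -type l -exec test ! -e {} \; -delete   # removes both without errors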

find -type f | xargs file | grep ".*: .* text" | sed "s;\(.*\): .* text.*;\1;"
find . -type d | while read dir ; do num=`ls -l $dir | grep '^-' | wc -l` ; echo "$num $dir" ; done | sort -rnk1 | head
diff <(cd dir1 && find . | sort) <(cd dir2 && find . | sort)
find . -name "*.[ch]" -exec grep -i "search phrase" /dev/null {} \;
2012-12-04 20:51:04
User: MikeGoerling
Functions: find grep
Tags: find grep
0

Old SysV systems and Sun machines don't have grep's -H option. Adding /dev/null as an extra argument forces grep into multi-file mode, so it reports the file name with each match.
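
On systems whose grep does support it (GNU grep and modern BSDs), the -H flag gives the same result without the /dev/null trick:

find . -name "*.[ch]" -exec grep -iH "search phrase" {} \;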

find . -type f -printf "%T@ %Tc %p\n" |sort -n |cut -d' ' -f2- |tail -n20
find . | xargs perl -p -i.bak -e 's/oldString/newString/;'
2012-11-28 17:11:18
User: RedFox
Functions: find perl xargs
0

find . = sets up the recursive search. You can narrow it to certain files by adding -name "*.ext", or limit it by using the same test together with -prune, like -name "*.ext" -prune.

xargs = builds a command line from each file that find produces and invokes the next command, perl, on it.

perl = invokes perl.

-p wraps the expression in a loop that reads and prints every line of each file.

-i.bak edits each file in place, keeping a backup such as filename.ext.bak.

-e executes the following expression:

's/oldString/newString/;' is your basic substitute-and-replace.
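
Putting it together, a run narrowed to one file type might look like this (the extension and strings are only examples); the -print0/-0 variant is safer if file names contain spaces:

find . -name "*.txt" | xargs perl -p -i.bak -e 's/oldString/newString/g;'
find . -name "*.txt" -print0 | xargs -0 perl -p -i.bak -e 's/oldString/newString/g;'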

find . -size 0c -print -exec rm -f {} \;
find . \( -name \*.cgi -o -name \*.txt -o -name \*.htm -o -name \*.html -o -name \*.shtml \) -print | xargs grep -s pattern
find . -type f -exec grep -ils stringtofind {} +
find . -name "*" -print | xargs grep -s pattern
for I in $(find . -depth -type d -not -path "*/.svn*" -print) ; do N="$(ls -1A ${I} | wc -l)"; if [[ "${N}" -eq 0 || "${N}" -eq 1 && -n $(ls -1A ${I} | grep .svn) ]] ; then svn rm --force "${I}"; fi ; done
find . -type f -regex '.*html$' -exec sed -i 's/\xEF\xBB\xBF//' '{}' \;
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
find . -type f -exec grep -l "some string" {} \;
mplayer $(find . -iname '*.avi' | shuf -n1)
find . -type f |egrep '^./.*\.' |sed -e "s/\(^.*\.\)\(.*$\)/\2/" |sort |uniq
2012-11-12 17:17:55
User: dvst
Functions: egrep find sed sort
0

Finds files recursively from the current directory and lists each distinct file extension once.

find . -maxdepth 2 -type d -name '.git' -print0 | while read -d ''; do (cd "$REPLY"; git gc); done
2012-11-07 08:38:33
User: unhammer
Functions: cd find read
Tags: git drivespace
-1

Assumes you've cd'd to the folder in which all your git repos reside; you could run it from ~ without -maxdepth, although that might make find take quite a while longer.

If you have several processor cores but not that much RAM, you might want to run

git config --global pack.threads 1

first, since gc-ing can eat lots of RAM.
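
A sketch of the same idea run from your home directory, capped at a few levels so find stays quick (the depth is just an example):

find ~ -maxdepth 4 -type d -name '.git' -print0 | while read -d '' -r; do (cd "$REPLY" && git gc); done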

find . -printf "touch -m -d \"%a\" '%p'\n" | tee /tmp/retime.sh
2012-11-05 20:32:05
User: dmmst19
Functions: find tee
4

Sometimes when copying files from one place to another, the timestamps get lost. Maybe you forgot to add a flag to preserve timestamps in your copy command. You're sure the files are exactly the same in both locations, but the timestamps of the files in the new home are wrong and you need them to match the source.

Using this command, you will get a shell script (/tmp/retime.sh) that you can move to the new location and just execute - it will change the timestamps on all the files and directories to their previous values. Make sure you're in the right directory when you launch it, otherwise all the touch commands will create new zero-length files with those names. Since find's output includes "." it will also change the timestamp of the current directory.

Ideally rsync would be the way to handle this - since it only sends changes by default, there would be relatively little network traffic resulting. But rsync has to read the entire file contents on both sides to be sure no bytes have changed, potentially causing a huge amount of local disk I/O on each side. This could be a problem if your files are large. My approach avoids all the comparison I/O. I've seen comments that rsync with the "--size-only" and "--times" options should do this also, but it didn't seem to do what I wanted in my test. With my approach you can review/edit the output commands before running them, so you can tell exactly what will happen.

The "tee" command both displays the output on the screen for your review, AND saves it to the file /tmp/retime.sh.

Credit: got this idea from Stone's answer at http://serverfault.com/questions/344731/rsync-copying-over-timestamps-only?rq=1, and combined it into one line.
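
A usage sketch for the scenario above (the host and directory names are hypothetical): generate the script at the source, copy it over, then run it from the matching directory on the destination:

cd /data/source && find . -printf "touch -m -d \"%a\" '%p'\n" | tee /tmp/retime.sh
scp /tmp/retime.sh desthost:/tmp/
ssh desthost 'cd /data/copy && sh /tmp/retime.sh'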

find . -type f -print | awk -F'.' '{print $NF}' | sort | uniq -c
find /test -type f -printf "%AY%Aj%AH%AM%AS---%h/%f\n" | sort -n