
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using find - 1,045 results
find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
2009-08-13 13:13:33
User: fsilveira
Functions: cut find ls sort tr xargs
Tags: sort find ls
Votes: 10

This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the regular commands ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr") the sorting won't be correct once the file list exceeds the command-line argument limit, because ls ends up being invoked more than once.

This command does the sorting in the pipeline rather than on ls's argument list, so the sorted list is displayed correctly.
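
If you only need to spot the largest few files and don't care about the full ls -l detail, a simpler sketch (assuming GNU find's -printf; the top-20 cutoff here is arbitrary) avoids the problem entirely by never passing the list to ls:

find . -type f -printf '%s\t%p\n' | sort -n | tail -n 20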

find . -name "*.[ch]" -exec grep "TODO" {} +
2009-08-13 06:17:22
User: peshay
Functions: find grep
Tags: grep
Votes: 3

-exec works better and faster than using a pipe.

find . -name "*.[ch]" | xargs grep "TODO"
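
The same -exec ... + form takes further grep options as you would expect; for example (a small variation, not part of the original entry), adding -n shows the line number of each TODO:

find . -name "*.[ch]" -exec grep -n "TODO" {} +
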
find $MAILDIR/ -type f -printf '%T@ %p\n' | sort --reverse | sed -e '{ 1,100d; s/[0-9]*\.[0-9]* \(.*\)/\1/g }' | xargs -i sh -c "cat {}&&rm -f {}" | gzip -c >>ARCHIVE.gz
find . -name '*'.tiff -exec bash -c "mogrify -format jpg -quality 85 -resize 75% {} && rm {}" \;
2009-08-10 18:27:10
Functions: bash find
Votes: 4

Simple command to convert a large number of images to JPEG format. It will delete the originals after conversion.
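
If you would rather keep the originals, a sketch with the same mogrify options but without the rm step should do it (mogrify -format writes a new .jpg next to each .tiff):

find . -name '*.tiff' -exec mogrify -format jpg -quality 85 -resize 75% {} \;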

find . -name \*.pdf -exec pdfinfo {} \; | grep Pages | sed -e "s/Pages:\s*//g" | awk '{ sum += $1;} END { print sum; }'
find . -name "-help" -exec mv {} help.txt \;
find . -type f -print0 | xargs -0 -P 4 -n 40 grep -i foobar
2009-08-05 23:18:44
User: ketil
Functions: find grep xargs
Votes: 4

xargs -P N spawns up to N worker processes. -n 40 means each grep invocation gets up to 40 file names on its command line.
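
A common tweak (assuming GNU coreutils for nproc) is to match the number of workers to the number of CPUs and list only the names of matching files:

find . -type f -print0 | xargs -0 -P "$(nproc)" -n 40 grep -il foobar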

find . -name "*.gz" | xargs -n 1 -I {} bash -c "gunzip -c {} | sort | gzip -c --best > {}.new ; rm {} ; mv {}.new {}"
2009-08-05 14:16:15
User: kennethjor
Functions: bash find xargs
Votes: -2

I used this because I needed to sort the content of a bunch of gzipped log files. Replace sort with something else, or simply remove sort to just rezip everything.
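
As a sketch of the "just rezip everything" variation mentioned above, dropping sort and letting mv overwrite the original:

find . -name "*.gz" | xargs -n 1 -I {} bash -c "gunzip -c {} | gzip -c --best > {}.new && mv {}.new {}"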

find . -depth -type d -empty -exec rmdir -v {} +
2009-08-05 13:48:13
User: syssyphus
Functions: find rmdir
Tags: find
Votes: 7

This will show the names of the deleted directories, and it will only delete directories that contain no files, only empty directories (-depth makes find process a directory's contents before the directory itself).

find . -empty -type d -exec rmdir {} +
2009-08-04 16:55:34
User: jsiei97
Functions: find rmdir
Votes: 14

A quick way to find and delete empty dirs; it starts in the current working directory.

If you run find . -empty -type d first, you will see what would be removed, as a test run.

find -type f -exec md5sum '{}' ';' | sort | uniq --all-repeated=separate -w 33 | cut -c 35-
2009-08-04 07:05:12
User: infinull
Functions: cut find md5sum sort uniq
Votes: 18

Calculates the md5 sum of each file. sort is required for uniq to work; uniq then groups on the hash alone (-w 33), and cut removes the hash from the result so only the duplicate file names remain, grouped by blank lines.
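
A variant in the same spirit that batches files through md5sum via xargs (usually faster than one -exec per file, and safe with spaces in names thanks to -print0) might look like this; it leaves the hash in the output rather than cutting it off:

find . -type f -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate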

find . -type d -exec env d="$dest_root" sh -c ' exec mkdir -p -- "$d/$1"' '{}' '{}' \;
find . -maxdepth 2 -name "*somepattern" -print0 | xargs -0 -I "{}" echo mv "{}" /destination/path
2009-08-01 01:55:47
User: jonasrullo
Functions: echo find mv xargs
Votes: 3

Only tested on Ubuntu Hardy (8.04). Works when file names have spaces. The -maxdepth 2 limits the find search to the current directory and one level below it in this example; without it, find was searching every directory below the current one, so this was faster on my system, almost as fast as locate. You must use double quotes around the pattern to handle spaces in file names.

-print0 is used in combination with xargs -0 (those are zeros, not the letter "O"). For xargs, -I is used to replace the following "{}" with the incoming file-list items from find. echo just prints to the command line what mv would do; mv needs "{}" again so it knows what you are moving from, and the command ends with the move destination. Some versions may only require one "{}", after the -I and not in the mv command, but this is what worked for me on Ubuntu 8.04. Some like to add -type f to the find command to limit matches to regular files.
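
Once the echoed output looks right, the same pipeline without echo performs the actual move:

find . -maxdepth 2 -name "*somepattern" -print0 | xargs -0 -I "{}" mv "{}" /destination/path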

find . -maxdepth 1 -type f | wc -l
2009-07-31 14:53:29
User: guckes
Functions: find wc
Tags: wc
Votes: 6

A simple "ls" lists files *and* directories. So we need to "find" the files (type 'f') only.

As "find" is recursive by default we must restrict it to the current directory by adding a maximum depth of "1".

If you are using zsh, then you can use the dot (.) as a globbing qualifier to denote plain files:

zsh> ls *(.) | wc -l

For more info see the zsh manual on expansion and substitution: "man zshexpn".

find . -type f -name "*.c" -exec cat {} \; | wc -l
2009-07-30 10:06:51
User: foremire
Functions: cat find wc
Votes: 1

Use find to collect all the .c files under the target directory, cat them into one stream, then pipe that to wc to count the lines.
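
If a per-file breakdown is also useful, a sketch that lets wc do the work directly (note that wc prints one subtotal per batch when the file list is too long for a single command line) is:

find . -type f -name "*.c" -exec wc -l {} +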

find . -iname '*filename*.doc' | { while read line; do antiword "$line"; done; } | grep -C4 search_term;
2009-07-28 15:49:58
User: Ben
Functions: find grep read
Votes: 3

Find Word docs by filename in the current directory, convert each of them to plain text using antiword (taking care of spaces in filenames), then grep the converted text for a search term.

(Of course, it's better to save your data as plain text to make for easier grepping, but that's not always possible.)

Requires antiword. Or you can modify it to use catdoc instead.
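
A minimal sketch of the catdoc variation mentioned above (assuming catdoc is installed; it prints the document text to stdout much like antiword):

find . -iname '*filename*.doc' | { while read line; do catdoc "$line"; done; } | grep -C4 search_term;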

find -name '*oldname*' -print0 | xargs -0 rename 's/oldname/newname/'
2009-07-27 00:44:06
Functions: find rename xargs
Votes: 0

This is better than doing a "for `find ...`; do ...; done" loop: if any of the returned filenames has a space in it, the loop mangles it, whereas this version should be able to handle any file name.

Of course, this only works if you have rename installed on your system, so it's not a very portable command.
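
Note that there are two common rename implementations: the Perl one used above takes an s/// expression, while the util-linux one takes a plain from/to pair. A rough equivalent for the util-linux flavour (assuming that is the rename on your system) would be:

find -name '*oldname*' -print0 | xargs -0 rename oldname newname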

find public_html/stuff -type d -exec chmod 755 {} + -or -type f -exec chmod 644 {} +
2009-07-26 11:10:10
User: recursiverse
Functions: chmod find
Votes: 7

Good for fixing web permissions. You might also want to do something like this and skip files or directories that begin with a period:

find public_html/stuff -not -name ".*" \( -type d -exec chmod 755 {} + -o -type f -exec chmod 644 {} + \)

...or include a special case for scripts:

find public_html/stuff -type d -exec chmod 755 {} + -or -type f -name "*.pl" -exec chmod 755 {} + -or -exec chmod 644 {} +
find . *oldname* | grep oldname | perl -p -e 's/^(.*)(oldname)(.*$)/mv $1$2$3 $1newname$3/' | sh
DD=`cat /etc/my.cnf | sed "s/#.*//g;" | grep datadir | tr '=' ' ' | gawk '{print $2;}'` && ( cd $DD ; find . -mindepth 2 | grep -v db\.opt | sed 's/\.\///g; s/\....$//g; s/\//./;' | sort | uniq | tr '/' '.' | gawk '{print "CHECK TABLE","`"$1"`",";";}' )
2009-07-25 03:42:31
User: atcroft
Functions: cd find gawk grep sed sort tr uniq
Votes: -1

This command will generate "CHECK TABLE `db_name.table_name` ;" statements for all tables present in databases on a MySQL server, which can be piped into the mysql command. (Can also be altered to perform OPTIMIZE and REPAIR functions.)

Tested on MySQL 4.x and 5.x systems in a Linux environment under bash.
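
As the description says, the generated statements can be piped straight into the mysql client. On most installs the bundled mysqlcheck tool offers a shorter route to the same kind of check (the credentials here are just placeholders):

mysqlcheck --all-databases -u root -p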

find / | xargs ls -l | tr -s ' ' | cut -d ' ' -f 1,3,4,9
find . -type f ! -name "*html"
find . -name '*.html' -print0| xargs -0 -L1 cat |sed "s/[\"\<\>' \t\(\);]/\n/g" |grep "http://" |sort -u
2009-07-14 07:00:15
User: jamespitt
Functions: cat find grep sed sort xargs
Votes: 4

Just a handy way to get all the unique links from inside all the HTML files in a directory. Can be useful in scripts etc.
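
With GNU grep, a rougher but shorter sketch that skips the sed pass (using -o to print just the matched URLs and -r with --include to handle the recursion) is:

grep -rhoE 'http://[^"<> ]+' --include='*.html' . | sort -u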

find . -not \( -name .svn -prune \) -type f -print0 | xargs --null grep <searchTerm>
2009-07-08 20:08:05
User: qazwart
Functions: find grep xargs
Tags: find xargs grep
Votes: 8

By putting the "-not \( -name .svn -prune \)" in the very front of the "find" command, you eliminate the .svn directories in your find command itself. No need to grep them out.

You can even create an alias for this command:

alias svn_find="find . -not \( -name .svn -prune \)"

Now you can do things like

svn_find -mtime -3
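
The alias composes with the rest of a pipeline as usual; for example, combining it with the grep from the original command:

svn_find -type f -print0 | xargs --null grep <searchTerm>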