What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands tagged find - 364 results
find . -type f \( -name "*.js" -o -name "*.php" -o -name "*.inc" -o -name "*.html" -o -name "*.htm" -o -name "*.css" \) -exec grep -il 'searchString' {} \;
2010-02-07 15:28:20
User: niels_bom
Functions: find grep
Tags: find grep search

Use find to recursively list all files from the current directory downwards that have one of the listed extensions. Then grep each file found for 'searchString' and print the filename (case-insensitively, thanks to -il) if it matches.
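For comparison, GNU grep can run the same search on its own using --include filters; a rough equivalent (a sketch, assuming GNU grep):

grep -ril --include='*.js' --include='*.php' --include='*.inc' --include='*.html' --include='*.htm' --include='*.css' 'searchString' .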

find /path/to/dir -type f -printf "%T@|%p\n" 2>/dev/null | sort -n | tail -n 1 | awk -F\| '{print $2}'
newest () { find ${1:-\.} -type f |xargs ls -lrt ; }
newest () { DIR=${1:-'.'}; CANDIDATE=`find $DIR -type f|head -n1`; while [[ ! -z $CANDIDATE ]]; do BEST=$CANDIDATE; CANDIDATE=`find $DIR -newer "$BEST" -type f|head -n1`; done; echo "$BEST"; }
2010-02-04 12:40:44
User: shadycraig
Functions: echo head

Works recursively in the specified dir, or '.' if none is given.

Repeatedly calls 'find' to look for a newer file; when no newer file exists, you have the newest.

In this case 'newest' means most recently modified. To find the most recently created, change -newer to -cnewer.
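Usage, once the function has been pasted into the shell or sourced from a startup file, is simply (the directory here is just an example):

newest /var/log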

find . -type f |xargs -I% sed -i '/group name/s/>/ deleteMissing="true">/' %
2010-02-01 21:09:57
User: 4fthawaiian
Functions: find sed xargs

Changed out the for loop for an xargs. It's a tad shorter, and a tad cleaner.
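If any filenames might contain newlines or other awkward characters, a null-delimited variant (a sketch, assuming GNU find, xargs and sed) is more robust:

find . -type f -print0 | xargs -0 sed -i '/group name/s/>/ deleteMissing="true">/'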

for i in `find . -type f`; do sed -i '/group name/s/>/ deleteMissing="true">/' $i; done
2010-02-01 17:16:37
User: allrightname
Functions: sed

Recursively edits files, replacing a character only on lines matching a string. Lines containing "group name" will have their first > character replaced, while > characters on other lines are left alone.

find directory/ -exec grep -ni phrase {} +
2010-01-28 12:15:24
User: sanmiguel
Functions: find grep
Tags: find grep

The difference between this and the other alternatives here using only grep is that find will, by default, not follow a symlink. In some cases, this is definitely desirable.

Using find also allows you to exclude certain files, e.g.

find directory/ ! -name "*.tmp" -exec grep -ni phrase {} +

would allow you to exclude any .tmp files.

Also note that there's no need for calling grep recursively, as find passes each found file to grep.
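GNU grep can also search recursively with exclusions of its own; its -r option does not follow symlinks met during the traversal (while -R does), so a rough equivalent is (a sketch):

grep -rni --exclude='*.tmp' 'phrase' directory/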

find -type d -name ".svn" -prune -o -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type d -name ".svn" -prune -o -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
2010-01-28 09:45:29
User: 2chg
Functions: find md5sum sort uniq xargs

Improvement of the command "Find Duplicate Files (based on size first, then MD5 hash)" when searching for duplicate files in a directory containing a Subversion working copy. This way the (multiple) duplicates in the meta-information directories are ignored.

Can easily be adapted for other VCSs as well. For CVS, for example, change ".svn" into "CVS":

find -type d -name "CVS" -prune -o -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type d -name "CVS" -prune -o -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
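The same pruning idea extends to several VCS metadata directories at once; a sketch of just the modified prune expression, the rest of the pipeline staying as above:

find . -type d \( -name ".svn" -o -name ".git" -o -name "CVS" \) -prune -o -not -empty -type f -printf "%s\n"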
find . -type d -empty -delete
find /path/to/images -name '*.JPG' -exec bash -c 'mv "$1" "${1/%.JPG/.jpg}"' -- {} \;
2010-01-07 15:41:17
User: sorpigal
Functions: bash find
Tags: bash find mv

Recursively rename .JPG to .jpg using standard find and mv. It's generally better to use a standard tool if doing so is not much more difficult.
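If a lowercase twin might already exist, mv's -n option (available in GNU and BSD mv) avoids silently overwriting it; a sketch:

find /path/to/images -name '*.JPG' -exec bash -c 'mv -n "$1" "${1%.JPG}.jpg"' -- {} \;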

find /path/to/images -name '*.JPG' -exec rename 's/\.JPG$/.jpg/' \{\} \;
2010-01-02 19:12:37
User: renich
Functions: find rename
Tags: find rename

This command is useful for renaming clipart, a pic gallery or your photo collection. It only changes the uppercase extension to lowercase.
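Note that this relies on the Perl rename (the default on Debian/Ubuntu). The util-linux rename takes a plain from/to pair instead of a regex, so a rough equivalent there would be (a sketch):

find /path/to/images -name '*.JPG' -exec rename .JPG .jpg {} +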

find . -type f -exec sed -i s/oldstring/newstring/g {} +
2009-12-09 00:46:13
User: SlimG
Functions: find sed
Tags: sed find

This command finds all files in the current dir and subdirs, and replaces all occurrences of "oldstring" in every file with "newstring".
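A variant that only rewrites files actually containing the string (a sketch, assuming GNU grep, xargs and sed) leaves the timestamps of every other file untouched:

grep -rlZ 'oldstring' . | xargs -0 sed -i 's/oldstring/newstring/g'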

find . -type d -exec sh -c "normalize-audio -b \"{}\"/*.mp3" \;
2009-12-08 03:13:13
Functions: find sh

Execute this in the root of your music library; it recurses through the directories and normalizes each folder containing mp3s as a batch. This assumes each folder holds one album. The command "normalize-audio" may go by "normalize" on some systems.
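Directories containing no mp3s will make the inner shell pass the literal *.mp3 glob to normalize-audio; a sketch that skips those directories:

find . -type d -exec sh -c 'd=$1; set -- "$d"/*.mp3; [ -e "$1" ] && normalize-audio -b "$@"' -- {} \;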

grep -H -n "pattern" *
2009-11-24 08:48:38
Functions: grep
Tags: find

If it's just the current directory, there's no need for the find command; grep alone will do.
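And if subdirectories are wanted after all, GNU grep's -r option covers that without find either (a sketch):

grep -rHn "pattern" .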

find ./ -name $1 -exec grep -H -n $2 '{}' ';'
find <path> -name "*.tgz" -or -name "*.tar.gz" | while read file; do echo "$file: "; tar -tzf $file; done
2009-11-10 20:39:04
User: polaco
Functions: echo find read tar
Tags: find tar list

This will list all the files inside the tarballs found in any folder or subfolder of the provided path. The while loop echoes the name of each tarball before listing its files, so the tarball can be identified.
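For tarball names containing spaces, a null-delimited version of the same loop (a sketch, assuming bash) is safer:

find <path> \( -name "*.tgz" -o -name "*.tar.gz" \) -print0 | while IFS= read -r -d '' file; do echo "$file:"; tar -tzf "$file"; done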

find . -mmin -60 -not -path "*svn*" -print|more
2009-11-10 18:34:53
User: bloodykis
Functions: find
Tags: bash svn find

Find files recursively that were updated in the last hour, ignoring SVN files and folders. Useful in case you did a full svn up by accident.
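A -prune based spelling (a sketch) avoids descending into the .svn directories at all, which can be noticeably faster on large working copies:

find . -name .svn -prune -o -mmin -60 -print | more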

find . -size 0 -exec rm '{}' \;
mysqldump -uUSERNAME -pPASSWORD database | gzip > /path/to/db/files/db-backup-`date +%Y-%m-%d`.sql.gz ;find /path/to/db/files/* -mtime +5 -exec rm {} \;
find . -name "*.txt" -exec sed -i "s/old/new/" {} \;
find -type f -name "*.avi" -print0 | xargs -0 mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
2009-09-24 15:50:39
User: syssyphus
Functions: find perl printf tail xargs

Change the *.avi to whatever you want to match; you can remove it altogether if you want to check all files.
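If mplayer is unavailable, ffprobe from FFmpeg reports per-file durations in seconds that can be summed the same way; a rough sketch (assuming ffprobe is installed):

find . -type f -name "*.avi" -print0 | xargs -0 -I{} ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 {} | awk '{t+=$1} END {printf "%02d:%02d:%02d\n", int(t/3600), int(t/60)%60, int(t)%60}'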

find . -name "*.txt" | xargs sed -i "s/old/new/"
find $HOME -type f -print0 | perl -0 -wn -e '@f=<>; foreach $file (@f){ (@el)=(stat($file)); push @el, $file; push @files,[ @el ];} @o=sort{$a->[9]<=>$b->[9]} @files; for $i (0..$#o){print scalar localtime($o[$i][9]), "\t$o[$i][-1]\n";}'|tail
2009-09-21 22:11:16
User: drewk
Functions: find perl

This pipeline will find, sort and display all files based on mtime. This could be done with find | xargs, but the find | xargs pipeline will not produce correct results if the output of find is larger than the xargs command-line buffer. If the xargs buffer fills, xargs processes the find results in more than one batch, which is not compatible with sorting.

Note the "-print0" on find and "-0" switch for perl. This is the equivalent of using xargs. Don't you love perl?

Note that this pipeline can easily be modified to sort on any data produced by perl's stat operator, e.g. you could sort on size, hard links, creation time, etc. Look at stat and just change the '9' to what you want. Changing the '9' to a '7', for example, will sort by file size; a '3' sorts by number of links.

Use head and tail at the end of the pipeline to get oldest files or most recent. Use awk or perl -wnla for further processing. Since there is a tab between the two fields, it is very easy to process.
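With GNU find, the same most-recently-modified listing can be had more directly (a sketch):

find $HOME -type f -printf '%T@ %p\n' 2>/dev/null | sort -n | tail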

mate - `find . -name 'filename'`
find . \( ! -name . -prune \) \( -type f -o -type l \)
2009-09-12 15:58:56
User: mobidyc
Functions: find

You must be in the directory to analyse.

Reports all files and links in the current directory, not recursively.

This find command has been tested on HP-UX, Linux, AIX and Solaris.
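On GNU find alone, -maxdepth 1 gives the same result, though it is not portable to the older finds listed above; a sketch:

find . -maxdepth 1 \( -type f -o -type l \)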