What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Commands tagged find
Terminal - Commands tagged find - 360 results
find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365
2013-12-11 14:51:53
User: cuberri
Functions: find

Useful when you want to cron a daily deletion task to keep only files less than one year old. The command excludes the .snapshot directory to prevent backup deletion.

One can append -delete to this command to delete the files:

find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365 -delete
find -type f -name '*.conf' -exec sed -Ei 's/foo/bar/' '{}' \;
2013-11-21 16:07:06
Functions: find sed

note that sed -i is non-standard (although both GNU and current BSD systems support it)

Can also be accomplished with

find . -name "*.txt" | xargs perl -pi -e 's/old/new/g'

as shown here - http://www.commandlinefu.com/commands/view/223/a-find-and-replace-within-text-based-files-to-locate-and-rewrite-text-en-mass.
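Since sed -i is non-standard, a portable variant (a sketch; the foo/bar substitution is just the example from above) writes to a temporary file and renames it over the original:

```shell
# Portable in-place edit without -i: write to a temp file, then rename
find . -type f -name '*.conf' | while read -r f; do
    sed 's/foo/bar/' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

Note this breaks on filenames containing newlines; with GNU tools, -print0 and xargs -0 handle those safely.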

find ~ -type f -size +500M -exec ls -ls {} \; | sort -n
2013-11-17 13:13:14
User: marcanuy
Functions: find ls sort
Tags: size find

Find all files larger than 500M in home directory and print them ordered by size with full info about each file.

find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -printf "%Ty-%Tm-%Td %p\n" | sort
find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -mtime +30 -print0 | xargs -0 rm -rf
find . -type f -not -empty -printf "%-25s%p\n"|sort -n|uniq -D -w25|cut -b26-|xargs -d"\n" -n1 md5sum|sed "s/ /\x0/"|uniq -D -w32|awk -F"\0" 'BEGIN{l="";}{if(l!=$1||l==""){printf "\n%s\0",$1}printf "\0%s",$2;l=$1}END{printf "\n"}'|sed "/^$/d"
2013-10-22 13:34:19
User: alafrosty
Functions: awk cut find sed sort uniq xargs

* Find all file sizes and file names from the current directory down (replace "." with a target directory as needed).

* sort the file sizes in numeric order

* List only the duplicated file sizes

* drop the file sizes so there are simply a list of files (retain order)

* calculate md5sums on all of the files

* replace the first instance of two spaces (md5sum output) with a \0

* drop the unique md5sums so only duplicate files remain listed

* Use AWK to aggregate identical files on one line.

* Remove the blank line from the beginning (This was done more efficiently by putting another "IF" into the AWK command, but then the whole line exceeded the 255 char limit).

>>>> Each output line contains the md5sum and then all of the files that have that identical md5sum. All fields are \0 delimited. All records are \n delimited.
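The same idea in a much shorter (GNU-only) form, skipping the size pre-filter — it hashes every file, so it is slower, but the pipeline is easier to read:

```shell
# Hash everything, sort by hash, and print only groups of repeated hashes;
# uniq -w32 compares just the 32-char md5 field (GNU uniq only)
find . -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate
```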

ls -l `find . -maxdepth 1 -type l -print`
find -size +100M
find -L /home/sonic/archive -name '*gz' -type f
2013-10-07 14:32:22
User: sonic
Functions: find
Tags: find

If /home/sonic/archive/ was a symlink to /backup/sonic/archive it would follow the links and give you the file listing. By default find will NOT follow symbolic links. The default behavior for the find command is to treat the symlinks as literal files.

I discovered this when trying to write a script run via cron to delete files with a modification time older than X days. The easiest solution was to use:

/usr/bin/find -L /home/sonic/archive -name '*gz' -type f -mtime +14 -exec rm '{}' \;

find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
2013-10-03 21:58:51
User: fuats
Functions: find head ls xargs

Lists the 20 most recently modified matching files, searching the current directory and all subdirectories.

for i in `find -L /var/ -wholename \*log\* -type d`; do COUNT=`ls -1U $i | wc -l`; if [ $COUNT -gt 10 ]; then echo $i $COUNT; fi; done
echo '#! /usr/bin/ksh\necho `cat $1 | openssl dgst -sha256` $1' > sslsha256; chmod +x sslsha256; find directory -type f -exec ./sslsha256 \{\} \;
2013-09-18 17:37:50
User: RAKK
Functions: chmod echo find

This command is for producing GNU sha256sum-compatible hashes on UNIX systems that don't have sha256sum but do have OpenSSL, such as stock IBM AIX.

1.- Saves a wrapper script for UNIX find that does the following:

A.- Feeds a file to openssl on SHA256 hash calculation mode

B.- Echoes the output followed by the filename

2.- Makes the file executable

3.- Runs find on a directory, only processing files, and running on each one the wrapper script that calculates SHA256 hashes

Still pending: figuring out how to verify a sha256sum file in a similar environment.
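A sketch of that missing verification step, assuming the checksum file uses the standard "HASH  FILE" layout (the checksums.sha256 filename is hypothetical):

```shell
# Verify each "HASH  FILE" line using openssl instead of sha256sum;
# awk grabs the last field because openssl prefixes "SHA256(file)= "
while read -r expected file; do
    actual=$(openssl dgst -sha256 "$file" | awk '{print $NF}')
    if [ "$actual" = "$expected" ]; then
        echo "$file: OK"
    else
        echo "$file: FAILED"
    fi
done < checksums.sha256
```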

find . -type f -print0 | xargs -0 stat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head
2013-08-03 09:53:46
User: HerbCSO
Functions: cut find sort stat xargs

Goes through all files in the directory specified, uses `stat` to print out the last modification time, then sorts numerically in reverse, then uses cut to remove the modified epoch timestamp, and finally head to output only the 10 most recently modified files.

Note that on a Mac `stat` won't work like this, you'll need to use either:

find . -type f -print0 | xargs -0 stat -f '%m%t%Sm %12z %N' | sort -nr | cut -f2- | head

or alternatively do a `brew install coreutils` and then replace `stat` with `gstat` in the original command.

find . -maxdepth 1 -type d -exec sh -c "printf '{} ' ; find '{}' -type f -ls | wc -l" \;
2013-07-29 19:46:35
User: HerbCSO
Functions: find sh

For each directory from the current one, list the counts of files in each of these directories. Change the -maxdepth to drill down further through directories.

find ./ -type f -print0 | xargs -0 md5sum
find . -type d | sed -e "s/[^-][^\/]*\// |/g" -e "s/|\([^ ]\)/|-\1/"
2013-07-16 10:08:34
User: opexxx
Functions: find sed
Tags: sed find

Show the directory tree.

find -printf "%C@ %p\n"|sort
2013-06-19 10:42:49
User: oivvio
Functions: find
Tags: sort find

This uses the ability of find (at least the one from GNU findutils that is shipped with most Linux distros) to display change time as part of its output. No xargs needed.

find /usr/include/ -name '*.[c|h]pp' -o -name '*.[ch]' -print0 | xargs -0 cat | grep -v "^ *$" | grep -v "^ *//" | grep -v "^ */\*.*\*/" | wc -l
2013-06-17 08:37:37
Functions: cat find grep wc xargs

Count the lines in your source and header files. This ignores blank lines, C++ style comments, and single-line C style comments.

It will not ignore blank lines containing tabs, or multiline C style comments.

find . -type f -name filename.exe -exec sed -i "s/oldstring/newstring/g" {} +
find . -type f -a \! -links 1
2013-05-06 20:44:08
User: malathion
Functions: find
Tags: find links

libpurple likes to hardlink files repeatedly. To ignore libpurple, use sed: | sed '/\.\/\.purple/d'
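Putting the command and the suggested filter together:

```shell
# List regular files with more than one hard link, ignoring libpurple's store
find . -type f ! -links 1 | sed '/\.\/\.purple/d'
```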

for ii in $(find /path/to/docroot -type f -name \*.php); do echo $ii; wc -lc $ii | awk '{ nr=$2/($1 + 1); printf("%d\n",nr); }'; done
2013-04-05 19:06:17
Functions: awk echo find wc

I have found that base64 encoded webshells and the like contain lots of data but hardly any newlines due to the formatting of their payloads. Checking the "width" will not catch everything, but then again, this is a fuzzy problem that relies on broad generalizations and heuristics that are never going to be perfect.

What I have done is set an arbitrary threshold (200 for example) and compare the values that are produced by this script, only displaying those above the threshold. One webshell I tested this on scored 5000+ so I know it works for at least one piece of malware.
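The threshold comparison described above could be scripted like this (a sketch; 200 is the arbitrary cutoff mentioned, and /path/to/docroot is a placeholder):

```shell
# Flag PHP files whose average bytes-per-line exceeds a threshold
THRESHOLD=200
find /path/to/docroot -type f -name '*.php' | while read -r f; do
    # wc -lc on stdin prints "lines chars"; average width = chars / (lines + 1)
    score=$(wc -lc < "$f" | awk '{ printf "%d", $2 / ($1 + 1) }')
    if [ "$score" -gt "$THRESHOLD" ]; then
        echo "$score $f"
    fi
done
```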

find ./public_html/ -name \*.php -exec grep -HRnDskip "\(passthru\|shell_exec\|system\|phpinfo\|base64_decode\|chmod\|mkdir\|fopen\|fclose\|readfile\) *(" {} \;
2013-04-03 12:42:19
User: lpanebr
Functions: find grep

Searched strings:

passthru, shell_exec, system, phpinfo, base64_decode, chmod, mkdir, fopen, fclose, readfile

Since some of the strings may occur in normal text or legitimately you will need to adjust the command or the entire regex to suit your needs.

print -rl /**/*(.f:o+w:)
2013-04-03 02:53:00
User: khayyam

Example of using zsh glob qualifier ...

"." = files

"f:" = files with access rights matching:

o+w = other plus write
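For shells without zsh's glob qualifiers, a rough find equivalent (a sketch; -perm -0002 matches the others-write bit):

```shell
# World-writable regular files from the root down, errors suppressed
find / -type f -perm -0002 2>/dev/null
```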

find /var/www/ -type f -print0 | xargs -0 chmod 644
2013-03-28 11:10:30
User: FiloSottile
Functions: chmod find xargs
Tags: find xargs chmod

xargs is a more elegant approach to executing a command on find results than -exec, as -exec is meant as a filtering flag.
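Note that POSIX find can batch arguments itself with the `{} +` terminator, which behaves like the xargs version without the extra pipe:

```shell
# chmod runs on batches of files, just as with xargs
find /var/www/ -type f -exec chmod 644 {} +
```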