
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
I've been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using find - 1,048 results
find */*.c | xargs grep 'mcs'
find -regextype posix-egrep -regex '.*\.(css|js)$' | xargs -I{} sh -c "echo '{}' && yuicompressor '{}' | gzip -c > '{}.gz'"
find /path/to/dir -type f -exec cachedel '{}' \;
2013-12-12 18:22:54
User: michelsberg
Functions: find
-1

This is just another example of what the nocache package is useful for. I described it in http://www.commandlinefu.com/commands/view/12357/ - the package provides the commands

nocache <command to run with page cache disabled>

cachedel <single file to remove from page cache>

cachestats <single file> # to get the current cache state

Often, we do not want to disable caching entirely, because a command involves several file reads and would be slowed down a lot by massive disk seeks. But after such operations, the file sits in the cache needlessly if we know we're very likely never going to touch it again.

cachedel helps to reduce cache pollution: frequently required files relevant for desktop interaction (libs/configs/etc.) are less likely to be evicted from RAM.

So we can run cachedel after each data intensive job. Today I run commands like these:

<compile job> && find . -type f -exec cachedel '{}' \; &> /dev/null # no need to keep all source code and tmp files in memory

sudo apt-get dist-upgrade && find /var/cache/apt/archives/ -type f -exec cachedel '{}' \; # Debian/*buntu system upgrade

dropbox status | grep -Fi idle && find ~/Dropbox -type f -exec cachedel '{}' \; &> /dev/null # if Dropbox is idle, remove sync'ed files from cache

https://github.com/Feh/nocache

http://packages.debian.org/search?keywords=nocache

http://packages.ubuntu.com/search?keywords=nocache

http://askubuntu.com/questions/122857

find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365
2013-12-11 14:51:53
User: cuberri
Functions: find
1

Useful when you want to cron a daily deletion task to keep only files younger than one year. The command prunes the .snapshot directory to prevent backups from being deleted.

One can append -delete to this command to delete the files:

find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365 -delete
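Before wiring the -delete form into cron, the -prune exclusion can be sanity-checked on a throwaway tree (directory and file names below are invented for the demo; -mtime is dropped so freshly created files match):

```shell
# Sandbox demo: confirm that -prune keeps .snapshot out of the results
tmp=$(mktemp -d)
mkdir -p "$tmp/data/.snapshot"
touch "$tmp/data/keep.txt" "$tmp/data/.snapshot/backup.txt"
# Same expression as above, minus -mtime, so the fresh files match
found=$(find "$tmp/data" -not \( -name .snapshot -prune \) -type f)
echo "$found"
rm -rf "$tmp"
```

Only keep.txt should be listed; backup.txt stays untouched behind the pruned directory.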
find <PATH> -maxdepth 1 -type f -name "server.log*" -exec tar czPf '{}'.tar.gz --transform='s|.*/||' '{}' --remove-files \;
find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir/"; tar -g "$destdir"/"$tarfile".snar -czf "$destdir"/"$tarfile"_`date +%F`.tgz -P $d; done
find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir"; tar -czvf "$destdir"/"$tarfile"_`date +%F`.tgz -P $d; done
2013-12-05 19:18:03
User: jaimerosario
Functions: cut find read tar
1

Problem: I wanted to back up user data individually, using an incremental method. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user (/mnt/storage/profiles/mike, /mnt/storage/profiles/lucy, ...).

I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find restricted to one directory level and two variables (tarfile=username & destdir=destination), tar creates a .tgz file for each folder, resulting in "mike_2013-12-05.tgz" and "lucy_2013-12-05.tgz".
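The -g flag is what makes this incremental (it is GNU tar specific). A minimal sketch of how the .snar snapshot file behaves, on a throwaway directory with invented names:

```shell
# Sketch of GNU tar's listed-incremental mode (-g); all paths are invented
tmp=$(mktemp -d)
mkdir -p "$tmp/profiles/mike"
echo one > "$tmp/profiles/mike/a.txt"
# Level 0: no .snar yet, so everything is archived and the state recorded
tar -g "$tmp/mike.snar" -czf "$tmp/level0.tgz" -P "$tmp/profiles/mike"
echo two > "$tmp/profiles/mike/b.txt"
# Level 1: same .snar, so only new/changed files are archived
tar -g "$tmp/mike.snar" -czf "$tmp/level1.tgz" -P "$tmp/profiles/mike"
level1=$(tar -tzf "$tmp/level1.tgz")
echo "$level1"
rm -rf "$tmp"
```

The level-1 listing should contain b.txt but not the unchanged a.txt. Note also that cut -d "/" -f5 only works because /mnt/storage/profiles/<user> has exactly that depth; basename "$d" would be less fragile.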

find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir/"; tar -czf $destdir/"$tarfile"_full.tgz -P $d; done
2013-12-05 19:07:17
User: jaimerosario
Functions: cut find read tar
1

Problem: I wanted to back up user data individually. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user (/mnt/storage/profiles/mike, /mnt/storage/profiles/lucy, ...).

I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find restricted to one directory level and two variables (tarfile=username & destdir=destination), tar creates a .tgz file for each folder, resulting in "mike_full.tgz" and "lucy_full.tgz".

cls && ipconfig | find "IPv4"
2013-12-03 13:08:01
User: dizzi90
Functions: find
2

May be useful for getting a user's IP address over the phone, since users struggle to read through a long ipconfig result.

find -type f -name '*.conf' -exec sed -Ei 's/foo/bar/' '{}' \;
2013-11-21 16:07:06
Functions: find sed
0

note that sed -i is non-standard (although both GNU and current BSD systems support it)

Can also be accomplished with

find . -name "*.txt" | xargs perl -pi -e 's/old/new/g'

as shown here - http://www.commandlinefu.com/commands/view/223/a-find-and-replace-within-text-based-files-to-locate-and-rewrite-text-en-mass.
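On systems where neither sed -i nor perl is available, a portable sketch is to write to a temp file and rename it over the original (the sandbox tree below is invented for the demo):

```shell
# POSIX-friendly in-place edit: redirect to a temp file, then mv over the original
tmp=$(mktemp -d)
printf 'foo here\n' > "$tmp/test.conf"
find "$tmp" -type f -name '*.conf' | while IFS= read -r f; do
    sed 's/foo/bar/' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
result=$(cat "$tmp/test.conf")
echo "$result"
rm -rf "$tmp"
```

(The while-read loop still assumes filenames without embedded newlines.)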

find ~ -type f -size +500M -exec ls -ls {} \; | sort -n
2013-11-17 13:13:14
User: marcanuy
Functions: find ls sort
Tags: size find
-1

Find all files larger than 500M in home directory and print them ordered by size with full info about each file.

find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -printf "%Ty-%Tm-%Td %p\n" | sort
find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -mtime +30 -print0 | xargs -0 rm -rf
find -name .git -prune -o -type f -exec md5sum {} \; | sort -k2 | md5sum
find -type f | grep -v "^./.git" | xargs md5sum | md5sum
find . -type f -not -empty -printf "%-25s%p\n"|sort -n|uniq -D -w25|cut -b26-|xargs -d"\n" -n1 md5sum|sed "s/ /\x0/"|uniq -D -w32|awk -F"\0" 'BEGIN{l="";}{if(l!=$1||l==""){printf "\n%s\0",$1}printf "\0%s",$2;l=$1}END{printf "\n"}'|sed "/^$/d"
2013-10-22 13:34:19
User: alafrosty
Functions: awk cut find sed sort uniq xargs
1

* Find all file sizes and file names from the current directory down (replace "." with a target directory as needed).

* sort the file sizes in numeric order

* List only the duplicated file sizes

* drop the file sizes so there are simply a list of files (retain order)

* calculate md5sums on all of the files

* replace the first instance of two spaces (md5sum output) with a \0

* drop the unique md5sums so only duplicate files remain listed

* Use AWK to aggregate identical files on one line.

* Remove the blank line from the beginning (This was done more efficiently by putting another "IF" into the AWK command, but then the whole line exceeded the 255 char limit).

Each output line contains the md5sum and then all of the files that share that md5sum. Fields are \0 delimited; records are \n delimited.
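The core trick — uniq -D on a fixed-width prefix of the md5sum output — can be seen in isolation on a three-file sandbox (file names invented for the demo):

```shell
# Minimal version of the duplicate-detection step: keep only lines whose
# first 32 chars (the md5) repeat, then print the file names
tmp=$(mktemp -d)
cd "$tmp"
printf 'same\n' > a; printf 'same\n' > b; printf 'other\n' > c
dupes=$(md5sum a b c | sort | uniq -D -w32 | awk '{print $2}')
echo "$dupes"
cd / && rm -rf "$tmp"
```

Files a and b share a hash and are kept; c is unique and dropped.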

find *.less | xargs -I {} lessc {} {}.css && ls *.less.css | sed -e 'p;s/less.css/css/' | xargs -n2 mv
find garbage/ -type f -delete
2013-10-21 23:26:51
User: pdxdoughnut
Functions: find
-1

I _think_ you were trying to delete files whether or not they had spaces. This would do that. You should probably be more specific though.

find /Users/jpn/.ievms/ -type f -print0| xargs -0 du -sh
2013-10-16 09:54:19
Functions: du find xargs
-3

When you do ls -1 | xargs rm, it doesn't work because those files have spaces in their names. So you must use

find -print0 and xargs -0
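The difference is easy to demonstrate with a single awkward filename (sandbox paths invented for the demo):

```shell
# One file whose name contains spaces
tmp=$(mktemp -d)
touch "$tmp/file with spaces.txt"
# Plain xargs word-splits the path into three arguments...
naive=$(find "$tmp" -type f | xargs -n1 | wc -l)
# ...while -print0 / -0 keeps it as one
safe=$(find "$tmp" -type f -print0 | xargs -0 -n1 | wc -l)
echo "$naive $safe"
rm -rf "$tmp"
```

The naive count comes out as 3 "files" where only 1 exists; the NUL-delimited version gets it right.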

find . -user root
find -size +100M
find -L /home/sonic/archive -name '*gz' -type f
2013-10-07 14:32:22
User: sonic
Functions: find
Tags: find
-1

If /home/sonic/archive were a symlink to /backup/sonic/archive, this would follow the link and give you the file listing. By default, find will NOT follow symbolic links; it treats them as literal files.

I discovered this when trying to write a script run via cron to delete files with a modification time older than X days. The easiest solution was to use:

/usr/bin/find -L /home/sonic/archive -name '*gz' -type f -mtime +14 -exec rm '{}' \;
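The behaviour is easy to reproduce in a sandbox (paths invented for the demo): a symlinked directory yields nothing under plain find but is traversed with -L:

```shell
# Demo: files behind a symlinked directory are invisible without -L
tmp=$(mktemp -d)
mkdir -p "$tmp/backup"
touch "$tmp/backup/old.gz"
ln -s "$tmp/backup" "$tmp/archive"
without=$(find "$tmp/archive" -name '*gz' -type f | wc -l)
with=$(find -L "$tmp/archive" -name '*gz' -type f | wc -l)
echo "$without $with"
rm -rf "$tmp"
```

Without -L, find examines the symlink itself (type l, not f) and finds nothing; with -L it descends into the target and finds the file.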

find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
2013-10-03 21:58:51
User: fuats
Functions: find head ls xargs
1

Lists the 20 most recently modified PDF files, looking in the current directory and all subdirectories.

find .git/objects -type f -printf "%P\n" | sed s,/,, | while read object; do echo "=== $object $(git cat-file -t $object) ==="; git cat-file -p $object; done
for AAA in $(find /usr/local/bin -type l); do ls -gG "${AAA}"; done
2013-10-01 10:49:12
User: rgregor
Functions: find ls
0

Display a list of local shell scripts soft-linked into /usr/local/bin.

Keep local shell scripts in your ~/bin/ directory and soft-link them into /usr/local/bin/, which is in $PATH, so you can run them from anywhere.
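A sketch of that workflow with invented names, sandboxed instead of using the real ~/bin and /usr/local/bin (which would need sudo):

```shell
# Install a personal script into a PATH directory via a symlink;
# "hello.sh" and both directories are stand-ins for ~/bin and /usr/local/bin
tmp=$(mktemp -d)
mkdir -p "$tmp/bin" "$tmp/usr-local-bin"
printf '#!/bin/sh\necho hello\n' > "$tmp/bin/hello.sh"
chmod +x "$tmp/bin/hello.sh"
ln -s "$tmp/bin/hello.sh" "$tmp/usr-local-bin/hello"
# The script is now runnable by name from anywhere on that PATH
out=$(PATH="$tmp/usr-local-bin:$PATH" hello)
echo "$out"
rm -rf "$tmp"
```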