What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using find - 1,039 results
find . -type f -printf "%T@ %Tc %p\n" |sort -n |cut -d' ' -f2- |tail -n20
find . | xargs perl -p -i.bak -e 's/oldString/newString/;'
2012-11-28 17:11:18
User: RedFox
Functions: find perl xargs
0

find . = sets up your recursive search. You can narrow the search to certain files by adding -name "*.ext", or exclude them with -prune, as in -name "*.ext" -prune.

xargs = builds a command line from each file that find finds, and invokes the next command, which is perl.

perl = invokes perl.

-p wraps the script in a while loop, so it runs against each line of input and prints the result.

-i edits in place; the .bak suffix makes a backup of each file, like filename.ext.bak.

-e executes the following script...

's/ / /;' is your basic substitute-and-replace.
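
As a quick illustration, here is a minimal sketch with hypothetical strings (foo and bar); adding -type f keeps directories off perl's argument list:

# hypothetical example: swap foo for bar in every regular file below .,
# keeping a .bak backup of each original
find . -type f | xargs perl -p -i.bak -e 's/foo/bar/;'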

find . -size 0c -print -exec rm -f {} \;
find . \( -name \*.cgi -o -name \*.txt -o -name \*.htm -o -name \*.html -o -name \*.shtml \) -print | xargs grep -s pattern
find . -type f -exec grep -ils stringtofind {} +
find . -name "*" -print | xargs grep -s pattern
for I in $(find . -depth -type d -not -path "*/.svn*" -print) ; do N="$(ls -1A "${I}" | wc -l)"; if [[ "${N}" -eq 0 || "${N}" -eq 1 && -n "$(ls -1A "${I}" | grep .svn)" ]] ; then svn rm --force "${I}"; fi ; done
find . -type f -regex '.*html$' -exec sed -i 's/\xEF\xBB\xBF//' '{}' \;
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
find . -type f -exec grep -l "some string" {} \;
mplayer $(find . -iname '*.avi' | shuf -n1)
find . -type f |egrep '^./.*\.' |sed -e "s/\(^.*\.\)\(.*$\)/\2/" |sort |uniq
2012-11-12 17:17:55
User: dvst
Functions: egrep find sed sort
0

Find files recursively from the current directory and list each file extension once.

find . -maxdepth 2 -type d -name '.git' -print0 | while read -d ''; do (cd "$REPLY"; git gc); done
2012-11-07 08:38:33
User: unhammer
Functions: cd find read
Tags: git drivespace
-1

Assumes you've cd'd to the folder in which all your git repos reside; you could run it from ~ without -maxdepth, although that might make find take quite a while longer.

If you have several processor cores, but not that much ram, you might want to run

git config --global pack.threads 1

first, since gc-ing can eat lots of ram.
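
If you do want the home-wide sweep, here is a sketch; the -maxdepth 4 bound is an assumption to keep find from crawling your entire home directory:

# hypothetical variant: look for repos up to 4 levels below $HOME
find "$HOME" -maxdepth 4 -type d -name '.git' -print0 | while read -d ''; do (cd "$REPLY" && git gc); done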

find . -printf "touch -m -d \"%a\" '%p'\n" | tee /tmp/retime.sh
2012-11-05 20:32:05
User: dmmst19
Functions: find tee
4

Sometimes when copying files from one place to another, the timestamps get lost. Maybe you forgot to add a flag to preserve timestamps in your copy command. You're sure the files are exactly the same in both locations, but the timestamps of the files in the new home are wrong and you need them to match the source.

Using this command, you will get a shell script (/tmp/retime.sh) that you can move to the new location and just execute - it will change the timestamps on all the files and directories to their previous values. Make sure you're in the right directory when you launch it, otherwise all the touch commands will create new zero-length files with those names. Since find's output includes "." it will also change the timestamp of the current directory.

Ideally rsync would be the way to handle this - since it only sends changes by default, there would be relatively little network traffic resulting. But rsync has to read the entire file contents on both sides to be sure no bytes have changed, potentially causing a huge amount of local disk I/O on each side. This could be a problem if your files are large. My approach avoids all the comparison I/O. I've seen comments that rsync with the "--size-only" and "--times" options should do this also, but it didn't seem to do what I wanted in my test. With my approach you can review/edit the output commands before running them, so you can tell exactly what will happen.

The "tee" command both displays the output on the screen for your review, AND saves it to the file /tmp/retime.sh.

Credit: got this idea from Stone's answer at http://serverfault.com/questions/344731/rsync-copying-over-timestamps-only?rq=1, and combined it into one line.
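
Putting the workflow together (the source and destination paths are hypothetical):

# on the source side: generate the script
cd /path/to/source && find . -printf "touch -m -d \"%a\" '%p'\n" | tee /tmp/retime.sh
# note: %a prints the last access time; %t prints the modification time,
# which may be what you want with touch -m
# copy /tmp/retime.sh to the destination, then run it from the matching root
cd /path/to/destination && sh /tmp/retime.sh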

find . -type f -print | awk -F'.' '{print $NF}' | sort | uniq -c
find /test -type f -printf "%AY%Aj%AH%AM%AS---%h/%f\n" | sort -n
find -maxdepth 3 -type d | while read -r dir; do printf "%s:\t" "$dir"; find "$dir" | wc -l; done
2012-10-15 15:00:09
User: brainstorm
Functions: find printf read wc
1

Counts the files present in each directory, recursively. Change maxdepth to get further insight into the directory hierarchy.

Found at unix.stackexchange.com:

http://unix.stackexchange.com/questions/4105/how-do-i-count-all-the-files-recursively-through-directories
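
For example, dropping maxdepth to 1 gives just the top-level per-directory totals:

# same loop, limited to the current directory's immediate subdirectories
find -maxdepth 1 -type d | while read -r dir; do printf "%s:\t" "$dir"; find "$dir" | wc -l; done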

find /var/cache/apt -not -mtime -7 | sudo xargs rm
find /path/to/search -xtype l
find . -name '*.rar' -execdir unrar e {} \;
2012-09-27 02:27:03
User: kyle0r
Functions: find
7

From the cwd, recursively find all rar files, extracting each rar into the directory where it was found, rather than cwd.

A nice time saver if you've used wget or similar to mirror something, where each sub dir contains a rar archive.

It's likely this can be tuned to work with multi-part archives where all parts use ambiguous .rar extensions, but I didn't test this. Perhaps unrar would handle this gracefully anyway?
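
An untested sketch along those lines, assuming volumes named *.part01.rar, *.part02.rar, and so on: point find at the first part only and let unrar pull in the remaining volumes itself:

# hypothetical: extract only from the first volume of each multi-part set
find . -name '*.part01.rar' -execdir unrar e {} \;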

find . -maxdepth 1 -type d | xargs du -sh
find site/ -type d | xargs sudo chmod 755
find ./ -type f | xargs sudo chmod 644
find /var/cache/pacman/pkg -not -mtime -7 | sudo xargs rm
2012-09-20 12:36:44
User: brejktru
Functions: find sudo xargs
1

Sometimes my /var/cache/pacman/pkg directory gets quite big. If that happens, I run this command to remove old package files. Packages that were upgraded in the last N days are kept, in case you are forced to downgrade a specific package. The command is obviously Arch Linux related.
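
The 7 is the N in question. For example, to keep two weeks of packages instead (a hypothetical retention window):

# keep packages touched within the last 14 days; -r makes GNU xargs skip
# running rm when find matches nothing
find /var/cache/pacman/pkg -not -mtime -14 | sudo xargs -r rm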