What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 11,926 results
find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
2009-09-03 17:55:26
User: arcege
Functions: find grep xargs
Tags: vim find grep
-1

Makes sure that find does not touch anything other than regular files, and handles non-standard characters in filenames when passing them to xargs.

pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
2010-05-13 14:44:03
User: KoRoVaMiLK
-1

Useful since "export http_proxy=blahblah:8080" doesn't seem to work with pear.

aptitude show $PROGRAM | grep Vers
2009-02-27 23:24:37
User: aabilio
Functions: grep
-1

Output: Version 3.2-0 (for example, if you type # aptitude show bash | grep Vers).

The output depends on the language of your distribution, since the word "Version" may be spelled differently in other languages.
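
A locale-independent variant (just a sketch, assuming aptitude honours LC_ALL like most Debian tools) forces the C locale so the field is always labelled "Version":

LC_ALL=C aptitude show bash | grep '^Version'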

xrandr -q | grep -w Screen
file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
2010-09-24 19:44:32
User: damncool
Functions: file sed seq
-1

Splits a PostScript file into multiple PostScript files; for each page of the input file one output file is generated. The output files are numbered, for example 1_orig.ps, 2_orig.ps, ...

The psselect command is part of the psutils package.

watch -n 10 free -m
2014-01-04 10:10:15
User: Darkstar
Functions: free watch
-1

This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval.
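
For example, a 2-second refresh that also highlights changes between updates (using watch's -d flag):

watch -n 2 -d free -m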

readlink -f /proc/<pid>/cmdline
2009-05-26 10:09:03
User: naseer
Functions: readlink
-1

Uses the pid to get the full path of the process. Useful when you do not know which command got picked from the path.
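
Note that on Linux the symlink that actually points at the executable is /proc/<pid>/exe; a variant using it (1234 is a placeholder pid) would be:

readlink -f /proc/1234/exe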

for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
2010-03-20 16:11:12
User: hfs
Functions: mv
-1

This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.
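
Note that rot13 is not a standard command on every system; a minimal replacement (a sketch using plain tr) can be defined first:

rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }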

ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
netstat -4tnape
wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
2010-01-17 11:25:47
User: oshazard
Functions: cd sed tar wget
-1

Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.

1.) wget source.tar.gz

2.) tar xzvf source.tar.gz

3.) cd source

4.) ls

From there you can run ./configure, make, etc.
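
A hypothetical invocation (the URL is only an example) would look like:

wtzc http://example.com/downloads/some-source-1.0.tar.gz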

split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo $MSG | mutt -a $i -s $SUBJ YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split
-1

This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

edit: (changed "g" to "e" for political correctness)

if [ -x /etc/*-release ]; then cat /etc/*-release ; else cat /etc/*-version ; fi
rsync -P -e 'ssh -p PORT' SRC DEST
2011-10-13 08:59:07
User: vickio
Functions: rsync
Tags: ssh rsync
-1

Transfer files with rsync over ssh on a non-standard port, showing a progress bar and resuming partial transfers.
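
For example (host, port and paths are placeholders):

rsync -P -e 'ssh -p 2222' backup.tar.gz user@example.com:/srv/backups/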

perl -e 'print "$_=$ENV{$_}\n" for keys %ENV'
find . -name "*~" -exec rm {} \;
echo -e "12 morning\n15 afternoon\n24 evening" | awk '{if ('`date +%H`' < $1) print "Good " $2}'
tr '[:upper:]' '[:lower:]' < input.txt > output.txt
sudo curl "http://hg.mindrot.org/openssh/raw-file/c746d1a70cfa/contrib/ssh-copy-id" -o /usr/bin/ssh-copy-id && sudo chmod 755 /usr/bin/ssh-copy-id
2012-02-09 20:29:24
User: misterich
Functions: chmod sudo
-1

Mac install ssh-copy-id

From there on out, you would upload keys to a server like this:

(make sure to double quote the full path to your key)

ssh-copy-id -i "/PATH/TO/YOUR/PRIVATE/KEY" username@server

or, if your SSH server uses a different port (often, they will require that the port be '2222' or some other nonsense):

(note the double quotes on *both* the "/path/to/key" and "user@server -pXXXX"):

ssh-copy-id -i "/PATH/TO/YOUR/PRIVATE/KEY" "username@server -pXXXX"

...where XXXX is the ssh port on that server

watch -n 7 -d 'uptime | sed s/.*users?, //'
odmget -q "attribute=unique_id" CuAt |sed -n 's/.*name = "\(.*\)"/\1/p;s/.*value = "..........\(....\)..SYMMETRIX..EMCfcp.*"/0x\1/p;s/.*value =//p'
setopt correct
2012-09-11 01:47:20
User: evandrix
Tags: zsh
-1

zsh has a powerful correction mechanism. If you type a command the wrong way, it suggests corrections. What happened here is that dir is an unknown command, and zsh suggests gdir, while maybe ls was what you wanted.

If you want to execute gdir, hit y (yes).

If you want to try to execute dir anyway, hit n (no).

If you want to execute a completely differently spelled command like ls, hit a (abort) and type your command.

If you want to execute a similarly spelled command like udir, hit e (edit) and edit your command.
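
The correction prompt looks roughly like this:

zsh: correct 'dir' to 'gdir' [nyae]?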

Limit kernel compilation load
for output in $(find . ! -name movie.nfo -name "*.nfo") ; do rm $output ; done
2014-04-01 17:41:50
User: analbeard
Functions: find rm
-1

Finds all .nfo files except movie.nfo and deletes them.
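
With GNU find the loop is not needed; a sketch using the -delete action:

find . ! -name movie.nfo -name "*.nfo" -delete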

grep -r -l xxxxx . | xargs perl -i -pe "s/xxxxx/yyyyy/g"
2009-02-06 08:18:50
User: hassylin
Functions: grep perl xargs
-1

This first finds, recursively, all files that contain the word xxxxx, then replaces xxxxx with yyyyy in those files.

Use cases:

- Web site domain change

- Renaming a function throughout a program
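
For example, for a domain change (olddomain.com and newdomain.com are placeholders):

grep -r -l 'olddomain\.com' . | xargs perl -i -pe 's/olddomain\.com/newdomain\.com/g'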