
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 11,585 results
find . -exec grep foobar /dev/null {} \; | awk -F: '{print $1}' | xargs vi
mp32ogg file.mp3
2009-11-16 20:22:48
User: nickleus
-1

Why would you want to convert MP3s to Ogg? One reason is that Ardour doesn't support MP3 files because of legal issues. That is really the only reason to do this, unless you have really bad hearing and also want smaller file sizes, because converting from one lossy format to another isn't a good idea.
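
If the mp32ogg script isn't available, ffmpeg can do the same conversion; a rough equivalent, assuming an ffmpeg build with libvorbis (-q:a 5 is a mid-range quality setting):

ffmpeg -i file.mp3 -c:a libvorbis -q:a 5 file.ogg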

jot -b '#' -s '' $COLUMNS
2010-04-13 22:03:39
User: dennisw
Tags: tr tput printf
-1

For BSD-based systems, including OS X, that don't have seq.

This version provides a default using tput in case $COLUMNS is not set:

jot -b '#' -s '' ${COLUMNS:-$(tput cols)}
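
On GNU systems that lack jot but have tput, roughly the same line can be produced with POSIX printf and tr:

printf '%*s' "${COLUMNS:-$(tput cols)}" '' | tr ' ' '#'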
find / -type f -size +512000 | xargs ls -lh | awk '{ print $5 " " $6$7 ": " $9 }'
2010-05-12 17:21:12
User: johnss
Functions: awk find ls xargs
-1

This is an updated version of a "find" command that someone provided me for finding files over a certain size. Keep in mind you may have to adjust the print values depending on your system to get the output you want. This was tested on FC and CentOS based servers. (Thanks to berta for the update.)
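
Since a bare -size +512000 counts 512-byte blocks (so roughly 250 MB), a more readable sketch on GNU or BSD find, which accept an M suffix, is:

find / -type f -size +250M -exec ls -lh {} + 2>/dev/null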

net user USERNAME /domain
wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
2009-07-02 01:46:21
User: bbelt16ag
Functions: wget
-1

Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.

cat /proc/cpuinfo
ash prod<tab>
2012-05-12 19:51:02
User: c3w
-1

http://github.com/c3w/ash

. a Ruby SSH helper script

. reads a JSON config file to read host, FQDN, user, port, tunnel options

. changes OS X Terminal profiles based on host 'type'

USAGE:

put 'ash' ruby script in your PATH

modify and copy ashrc-dist to ~/.ashrc

configure OS X Terminal profiles, such as "webserver", "development", etc.

run "ash myhostname" and away you go!

v.2 will re-attach to a 'screen' named in your ~/.ashrc

ls
curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2 | \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq
-1

This command queries the delicious API, runs the XML through xml2, greps out the @href lines, cuts off everything up to the first '=', passes the result through sort and uniq to remove any duplicates, and then hands it to linkchecker, which checks the links. Broken links go into the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't run these API queries more than once every few seconds or you can get banned by delicious; see their site for info. ~updated to be non-recursive

find . -name "*.php" -exec grep -il searchphrase {} \;
2010-01-16 05:09:30
Functions: find grep
-1

This is very similar to the first example except that it employs the 'exec' argument of find rather than piping the result to xargs. The xargs version (sketched below) is nice and tidy, but different *NIXes may not have as capable a grep command.
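
For comparison, the xargs variant alluded to above might look like this (assuming find and xargs support -print0/-0, as GNU and BSD versions do):

find . -name "*.php" -print0 | xargs -0 grep -il searchphrase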

du -s * | sort -nr | head
ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
2009-09-03 17:55:26
User: arcege
Functions: find grep xargs
Tags: vim find grep
-1

Makes sure that find does not touch anything other than regular files, and handles non-standard characters in filenames when passing them to xargs (-print0 paired with xargs -0 uses NUL separators, so spaces and quotes survive).

pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
2010-05-13 14:44:03
User: KoRoVaMiLK
-1

Useful since

"export http_proxy=blahblah:8080"

doesn't seem to work with pear
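
You can verify the stored value afterwards with pear's config-get command:

pear config-get http_proxy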

aptitude show $PROGRAM | grep Vers
2009-02-27 23:24:37
User: aabilio
Functions: grep
-1

Output: Version 3.2-0 (for example, if you type # aptitude show bash | grep Vers).

The output depends on the language of your distribution, because the word "Version" may be translated in other locales.
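
On Debian-based systems, a locale-independent way to get just the version is to ask dpkg directly:

dpkg-query -W -f='${Version}\n' bash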

xrandr -q | grep -w Screen
file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
2010-09-24 19:44:32
User: damncool
Functions: file sed seq
-1

Splits a PostScript file into multiple PostScript files; for each page of the input file one output file will be generated. The output files will be numbered, for example 1_orig.ps, 2_orig.ps, ...

The psselect command is part of the psutils package.
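
A slightly tidier sketch of the same loop, assuming the %%Pages: DSC comment appears in the header, pulls the page count with sed -n and uses psselect's -p flag:

file=orig.ps; for i in $(seq $(sed -n 's/^%%Pages: //p' "$file" | head -1)); do psselect -p$i "$file" "${i}_${file}"; done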

watch -n 10 free -m
2014-01-04 10:10:15
User: Darkstar
Functions: free watch
-1

This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval in seconds.
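
Adding the -d flag makes watch highlight the values that changed between updates:

watch -n 10 -d free -m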

readlink -f /proc/<pid>/exe
2009-05-26 10:09:03
User: naseer
Functions: readlink
-1

Uses the pid to get the full path of the process's executable via the /proc/<pid>/exe symlink. Useful when you do not know which command got picked up from the PATH.
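
For example, to resolve the binary behind the newest sshd process (pgrep -n picks the most recently started match; sshd is just an illustrative name):

readlink -f /proc/$(pgrep -n sshd)/exe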

for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo "$name" | rot13)${suffix:+.}${suffix%.}"; done
2010-03-20 16:11:12
User: hfs
Functions: mv
-1

This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.
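
Note that rot13 is not a standard utility; a minimal stand-in as a shell function using tr:

rot13 () { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }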

ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
netstat -4tnape
wtzc () { wget "$@"; foo=$(echo "$@" | sed 's:.*/::'); tar xzvf "$foo"; blah=$(echo "$foo" | sed 's:.*/::'); bar=$(echo "$blah" | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'); cd "$bar"; ls; }
2010-01-17 11:25:47
User: oshazard
Functions: cd sed tar wget
-1

Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.

1.) wget source.tar.gz

2.) tar xzvf source.tar.gz

3.) cd source

4.) ls

From there you can run ./configure, make, etc.

split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -a "$i" -s "$SUBJ" YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split
-1

This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

edit: (changed "g" to "e" for political correctness)