
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,925 results
for i in $(chkconfig --list | grep "4:on" | awk '{print $1}'); do chkconfig --level 4 "$i" off; done
2011-01-25 03:54:43
User: m1cawber
Functions: awk chkconfig grep
-1

Works in Fedora, CentOS, and presumably other distros that use chkconfig.

/System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/LaunchServices.framework/Versions/A/Support/lsregister -kill -r -domain local -domain system -domain user
watch -n 0.5 ssh [user]@[host] mysqladmin -u [mysql_user] -p[password] processlist | tee -a /to/a/file
2009-08-19 14:21:27
User: lunarblu
Functions: ssh tee watch
-1

Locally watch the MySQL process list on a remote host, updating every 0.5 seconds, while teeing what you see to a file. The file output is messy, but at least you have a history of what you saw.

genpass() { local h x y;h=${1:-8};x=( {a..z} {A..Z} {0..9} );y=$(echo ${x[@]} | tr ' ' '\n' | shuf -n$h | xargs);echo -e "${y// /}"; }
2009-10-24 04:05:42
User: twfcc
Functions: echo tr
-1

Generate a random password, 8 characters by default.
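A quick sketch of using the function as-is (note that shuf samples without replacement from the 62-character pool, so characters never repeat and requested lengths above 62 are silently capped):

```shell
# Random-password generator from the command above; default length 8
genpass() { local h x y;h=${1:-8};x=( {a..z} {A..Z} {0..9} );y=$(echo ${x[@]} | tr ' ' '\n' | shuf -n$h | xargs);echo -e "${y// /}"; }

genpass      # prints an 8-character alphanumeric password
genpass 12   # prints a 12-character one
```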

gwenview `wget -O - http://xkcd.com/ | grep 'png' | grep '<img src="http://imgs.xkcd.com/comics/' | sed s/title=\".*//g | sed 's/.png\"/.png/g' | sed 's/<img src=\"//g'`
2010-08-24 22:21:51
User: hunterm
Functions: grep sed
-1

Fetch xkcd's index page, extract the URL of the current comic image, and open the image in gwenview.

/System/Library/CoreServices/Menu\ Extras/User.menu/Contents/Resources/CGSession -suspend
curl "http://www.house.gov/house/MemberWWW.shtml" 2>/dev/null | sed -e :a -e 's/<[^>]*>//g;/</N;//ba' | perl -nle 's/^\t\t(.*$)/ $1/ and print;'
2009-09-24 23:37:36
User: drewk
Functions: perl sed
Tags: perl sed curl
-1

Uses curl to download the membership page of the US House of Representatives, sed to strip the HTML, and perl to print only the lines starting with two tabs (the lines containing a representative).
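The sed expression here is a self-contained multi-line HTML tag stripper that can be reused on any input; a small offline sketch (the sample HTML is made up for illustration):

```shell
# Strip HTML tags; the :a/N/ba loop joins lines when a tag spans a line break
printf '<html><body>\n<p>Hello <b>world</b></p>\n<p>Bye</p>\n</body></html>\n' |
  sed -e :a -e 's/<[^>]*>//g;/</N;//ba'
# Lines that held only tags come out blank; text survives untouched
```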

add-apt-repository [REPOSITORY]
awk -F $'\t' '{printf $1 LS $2 LS $3 LS $4 LS $5; for (i = 7; i < NF; i++) printf $i "\t"; printf "\n--\n";}' LS=$'\n' 'Ad report.tsv' | column -t -s $'\t'
2011-02-28 10:52:16
User: zhangweiwu
Functions: awk column printf
-1

The first five columns of a Google AdWords TSV export are text. They really belong in a single multi-line text cell, but since there is no reliable way to represent line breaks within cells in the .tsv format, Google splits them across five columns.

The problem is that with five columns of text there is hardly any room left for additional fields while keeping the output printable.

This command collapses the first five columns of each row into a single multi-line text cell, for console output or sending straight to a printer.

find ~ -name '*.sqlite' -exec sqlite3 '{}' 'VACUUM;' \;
wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
find garbage/ -type f -delete
2013-10-21 23:26:51
User: pdxdoughnut
Functions: find
-1

I _think_ you were trying to delete files whether or not they had spaces. This would do that. You should probably be more specific though.

wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | tail -n4
2009-09-25 02:36:46
User: dakunesu
Functions: sed tail wget
-1

Get Hong Kong weather information from the HK Observatory WAP site.

The other one showed a lot of blank lines for me.

find -maxdepth 1 -type f -name "*.7z" -exec 7zr e '{}' ';'
2010-01-23 19:50:10
User: minnmass
Functions: find
-1

Use find's built-in ability to call programs.

Alternatively,

find -maxdepth 1 -type f -name "*.7z" -print0 | xargs -0 -n 1 7zr e

would work, too.
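Both call styles generalize beyond 7zr; a throwaway sketch substituting echo for 7zr (the filenames are made up, including one with a space to show why -print0/-0 matters):

```shell
tmp=$(mktemp -d) && cd "$tmp"
touch a.7z b.7z "c d.7z"   # one name contains a space

# find's -exec runs the command once per matching file, spaces handled natively
find . -maxdepth 1 -type f -name "*.7z" -exec echo extracting '{}' ';'

# -print0 | xargs -0 does the same and is equally safe for spaces in names
find . -maxdepth 1 -type f -name "*.7z" -print0 | xargs -0 -n 1 echo extracting
```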

setserial -g /dev/ttyS[0-9]* | grep -v "unknown"
test $((RANDOM%2)) -eq 0
sudo add-apt-repository ppa:PPA_TO_ADD
2014-04-24 20:02:39
User: KlfJoat
Functions: sudo
-1

There is no longer any need to manually add PGP keys for Ubuntu Launchpad PPAs.

The add-apt-repository command creates a new file for the PPA in /etc/apt/sources.list.d/ and then adds the PPA's keys to the apt keyring automatically. No muss, no fuss.

perl -pi -e "s/\r/\n/g" <file>
2010-07-29 16:07:36
User: din7
Functions: perl
-1

Replace the carriage-return character ^M (\r) with a newline using a perl in-place edit. Note that true DOS files end lines with \r\n, so on those this leaves an extra blank line per line; to convert DOS to Unix, use s/\r$//g instead.

eog `curl 'http://xkcd.com/' | awk -F "ng): |</h" '/embedding/{print $2}'`
curl ifconfig.me
wget --spider -o wget.log -e robots=off --wait 1 -r -p http://www.example.com
2011-04-05 13:42:14
User: lele
Functions: wget
-1

This recursively visits all linked URLs starting from the specified URL. It saves nothing locally and produces a detailed log.

Useful for finding broken links in your site. It ignores robots.txt, so only use it on a site you own!
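Once the crawl finishes, wget.log can be mined for the broken links; a sketch against a fabricated log fragment (real log wording varies between wget versions, so treat the grep pattern as an assumption):

```shell
# Fabricated wget.log excerpt, for illustration only
cat > /tmp/wget_demo.log <<'EOF'
--2011-04-05 13:42:14--  http://www.example.com/missing.html
HTTP request sent, awaiting response... 404 Not Found
--2011-04-05 13:42:15--  http://www.example.com/ok.html
HTTP request sent, awaiting response... 200 OK
EOF

# Show each 404 together with the URL requested just before it
grep -B1 '404 Not Found' /tmp/wget_demo.log
```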

find . -name '*.java' -o -name '*.xml' | grep -v '\.svn' | xargs wc -l
2011-06-30 12:45:40
User: ewilson
Functions: find grep wc xargs
Tags: find grep wc
-1

There's nothing particularly novel about this combination of find, grep, and wc, I'm just putting it here in case I want it again.

sudo port installed | grep -v 'active\|The' | xargs sudo port uninstall
squidclient mgr:info | grep "file desc"
2010-07-29 17:35:20
User: KoRoVaMiLK
Functions: grep
-1

Shows useful information about file descriptor usage in the Squid web proxy.

makeself <archive_dir> <file_name> <label>
2012-01-10 18:08:50
User: totti
-1

Used by VirtualBox and others to create self-extracting '.run' files.