
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 11,860 results
iptables -A INPUT -s 65.55.44.100 -j DROP
ls -trF | grep -v \/ | tail -n 1
2011-09-14 20:05:37
User: mrpollo
Functions: grep ls tail
Tags: find stat mtime
-1

Sort by modification time in reverse to get ascending order, append a type marker to each name, filter out directories, and keep only the last result: the most recently modified file in the current directory.
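Given the find/stat/mtime tags, a rough equivalent built on GNU find (assuming GNU findutils and coreutils are available) might be:

find . -maxdepth 1 -type f -printf '%T@ %p\n' | sort -n | tail -n 1 | cut -d' ' -f2-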

sudo systemctl enable lxdm
curl ifconfig.me/all/xml
2010-04-21 20:45:17
User: truemilk
-1

Requests all available information about your IP address in XML format.
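To fetch just the address as plain text (as the wget one-liner further down this page does), the same service can be queried with:

curl ifconfig.me/ip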

find . -type f -iname "*.mp3" -exec id3v2 --delete-all {} \;
LANG=fr_FR.iso8859-1 find . -name '*['$'\xe9'$'\xea'$'\xeb'$'\xc9'']*'|while read f; do a="$(echo $f|iconv -f iso8859-1 -t ascii//TRANSLIT)"; echo "move $f => $a"; done
2011-04-06 17:03:31
User: gibboris
Functions: echo find read
-1

Warning: use convmv or detox if you can; they are the right tools for this.

But if you want to do it manually, you can use this command to find the problematic files and transliterate their accented characters to their ASCII equivalents.

(Useful when making CD backups: growisofs may fail on files whose names come from the old iso8859-* days.)
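To actually perform the rename rather than only print it, the echo in the loop body could be replaced with mv; an untested sketch of the same loop:

LANG=fr_FR.iso8859-1 find . -name '*['$'\xe9'$'\xea'$'\xeb'$'\xc9'']*'|while read f; do a="$(echo $f|iconv -f iso8859-1 -t ascii//TRANSLIT)"; mv "$f" "$a"; done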

find . -type f -exec md5 '{}' ';' | sort | uniq -f 3 -d | sed -e "s/.*(\(.*\)).*/\1/"
2012-01-14 08:54:12
User: noahspurrier
Functions: find sed sort uniq
-1

This works on Mac OS X using the `md5` command instead of `md5sum`, which works similarly, but has a different output format. Note that this only prints the name of the duplicates, not the original file. This is handy because you can add `| xargs rm` to the end of the command to delete all the duplicates while leaving the original.
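With the deletion step mentioned above appended, the full pipeline would look like this (use with care, and note that filenames containing spaces will trip up xargs):

find . -type f -exec md5 '{}' ';' | sort | uniq -f 3 -d | sed -e "s/.*(\(.*\)).*/\1/" | xargs rm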

sed -i 's/$/\r/' file
2012-02-23 08:34:30
User: evolix
Functions: sed
-1

This converts a UNIX-format text file to DOS format by appending a carriage return to each line.

You can use it in a loop to convert multiple files, like:

for i in *.bat; do sed -i 's/$/\r/' "$i"; done
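The reverse conversion (DOS back to UNIX) is the usual sed counterpart, again assuming GNU sed for the -i option:

sed -i 's/\r$//' file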

find . -name "*.jpg" | perl -ne'chomp; $name = $_; $quote = chr(39); s/[$quote\\!]/_/ ; print "mv \"$name\" \"$_\"\n"'
sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'|xargs -r tail -f
2010-07-30 18:20:00
User: vutcovici
Functions: echo eval grep ls sed sudo tail xargs
-1

Tails all log files that are opened by any java process. This is helpful when you are in a new environment and do not know where the logs are located. Instead of java you can put any process name. This command works only on Linux.

To list all the log files opened by java processes:

sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'
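A parameterized sketch of the same pipeline, wrapped in a hypothetical helper function that takes the process name as an argument:

taillogs() { sudo ls -l $(eval echo "/proc/{$(echo $(pgrep "$1")|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'|xargs -r tail -f; }

Usage: taillogs java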
mencoder FILENAME.3gp -ovc lavc -lavcopts vcodec=msmpeg4v2 -oac mp3lame -lameopts vbr=3 -o FILENAME.avi
p=~/.config/chromium/zed; cp -r ~/.config/chromium/Default $p && echo "chromium-browser --user-data-dir=$p" && chromium-browser --user-data-dir=$p;
2010-11-08 02:45:29
User: zed
Functions: cp echo
-1

Change the value of p to match the path where you wish to create the profile.

To run it again in the future, use the parameter --user-data-dir (which gets echoed to you when run):

chromium-browser --user-data-dir=/path/to/your/

Quick Functions:

# create a new chromium profile

new-chromium-profile() { p=~/.config/chromium/$1; cp -r ~/.config/chromium/Default $p && echo "chromium-browser --user-data-dir=$p" && chromium-browser --user-data-dir=$p; }

# runs a chromium profile

run-chromium-profile() { chromium-browser --user-data-dir=~/.config/chromium/$1; }

now=`date +"%Y/%m/%d" -d "04/02/2005"` ; end=`date +"%Y/%m/%d" -d "07/31/2005"`; while [ "$now" != "$end" ] ; do now=`date +"%Y/%m/%d" -d "$now + 1 day"`; echo "$now"; done
sed -i 's/`head -n 500 foo.log`//' foo.log
2011-05-23 09:41:35
User: kevinquinnyo
Functions: sed
-1

This is good for cleaning up log files without having to erase the entire contents of the file, and lets you keep only the most recent entries in the log.
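A plainer way to achieve the same thing, dropping the oldest 500 lines in place with a sed address range (assuming GNU sed for -i):

sed -i '1,500d' foo.log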

wget ifconfig.me/ip -q -O -
msgfilter --keep-header -i input.po -o empty.po awk -e '{}'
2012-01-14 13:29:26
User: unhammer
Functions: awk
-1

Basically creates a .pot-style file from a .po file: the header is kept (--keep-header) while every translated string is emptied, leaving it ready for translating.

FLOOR=0; RANGE=10; number=0; while [ "$number" -le $FLOOR ]; do number=$RANDOM; let "number %= $RANGE"; done; echo $number
2009-02-20 09:33:56
User: raphink
Functions: echo
Tags: bash
-1

This one-liner outputs a random number greater than FLOOR and less than RANGE.
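For example, a die roll (a number from 1 to 6) falls out of the same loop with FLOOR=0 and RANGE=7:

FLOOR=0; RANGE=7; number=0; while [ "$number" -le $FLOOR ]; do number=$RANDOM; let "number %= $RANGE"; done; echo $number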

getdji (){local url sedcmd;url='http://finance.yahoo.com/q?d=t&s=^DJI';sedcmd='/(DJI:.*)/,/Day.*/!d;s/^ *//g;';sedcmd="$sedcmd/Change:/s/Down / -/;/Change:/s/Up / +/;";sedcmd="$sedcmd/Open:/s//& /";lynx -dump "$url" | sed "$sedcmd"; }
find . -iname '*TODO*'
find ./ $1 -name "* *" | while read a ; do mv "${a}" "${a//\ /_}" ; done
for vm in `/usr/bin/vmware-cmd -l`; do /usr/bin/vmware-cmd "${vm}" stop trysoft; done
2011-09-15 06:56:49
User: maxheadroom
-1

This command will shut down all VMs on a VMware ESX host. First it tries to gracefully shut down each VM. If that fails, it does a hard shutdown and then powers it off.

su <username>
which somecommand
2009-02-05 10:29:04
User: chrisdrew
Functions: which
-1

Returns the pathname of the file that would be executed in the current environment if its argument were given as a command.
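Many implementations of which also accept -a to list every matching pathname on the PATH rather than just the first:

which -a somecommand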

cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
2009-08-05 13:33:06
Functions: cat
-1

If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of having to start downloading from the beginning. This is a real win for downloading Debian ISO images over a buggy DSL modem.

Take the partially downloaded file and cat it into the STDIN of curl, as shown. Then use the "-C -" option followed by the URL of the file you were originally downloading.
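curl can also resume directly into a local file, without the cat, by combining -C - with -O (or -o); for example:

curl -C - -O http://www.somewhere.com/file-I-want-to-download.zip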

{ u="http://twitter.com/commandlinefu"; echo "Subject: $u"; echo "Mime-Version: 1.0"; echo -e "Content-Type: text/html; charset=utf-8\n\n"; curl $u ; } | sendmail recipient@example.com
2010-02-24 04:18:30
User: pascalv
Functions: echo sendmail
-1

This will send the web page at $u to recipient@example.com. To send the web page to yourself, recipient@example.com can be replaced by $(whoami).

The "charset" is UTF-8 here, but any alternative charset of your choice would work.

`wget -O - -o /dev/null $u` may be considered instead of `curl $u`.

On some systems the complete path to sendmail may be necessary, for instance /sys/pkg/libexec/sendmail/sendmail on some NetBSD systems.
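Combining the two suggestions above, a self-addressed variant using wget instead of curl might look like:

{ u="http://twitter.com/commandlinefu"; echo "Subject: $u"; echo "Mime-Version: 1.0"; echo -e "Content-Type: text/html; charset=utf-8\n\n"; wget -O - -o /dev/null $u ; } | sendmail $(whoami)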