Terminal - Commands by hfs - 18 results
ack --java '\\u.?.?.?[^0-9a-fA-F]'
watch() { while true; do echo "<Ctrl+V><Ctrl+L>Every 2.0s: $@"; date; eval "$@"; sleep 2; done }
2012-03-07 09:30:15
User: hfs
Functions: echo eval sleep watch
Tags: watch
0

Usage:

watch ls -l

Basic but usable replacement for the "watch" command for those systems which don't have it (e.g. the Solaris I'm trapped on).

Type Ctrl+V to escape the following Ctrl+L which clears the screen. It will be displayed as "^L".
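The embedded control character can be avoided entirely: printf can emit the clear-screen escape itself. A sketch of the same idea, assuming a VT100-compatible terminal (`tput clear` would be the terminfo-portable spelling):

```shell
watch() {
    while true; do
        printf '\033[H\033[2J'   # cursor home + clear screen, no literal Ctrl+L needed
        echo "Every 2.0s: $*"
        date
        eval "$@"
        sleep 2
    done
}
```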

v () { echo "$@"; "$@"; }
2011-10-13 11:33:19
User: hfs
Functions: echo
-3

You can use this in shell scripts to show which commands are actually run: just prepend every "critical" line with "v " (v followed by a space).

TMP=/tmp

echo "Let me create a directory for you"

v mkdir $TMP/new

In scripts this can be more useful than "set -x", because that can be very verbose with variable assignments etc.

Another nice use: if every "critical" command is prefixed with "v", you can dry-run your script by commenting out the actual execution inside the function, leaving only the echo.
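Put together, a minimal self-contained version of the snippet above (using mktemp instead of a hard-coded /tmp path, which is an adjustment of mine, not part of the original):

```shell
v () { echo "$@"; "$@"; }

TMP=$(mktemp -d)            # note: plain assignment, no leading "$"
echo "Let me create a directory for you"
v mkdir "$TMP/new"          # prints "mkdir .../new", then executes it
rm -r "$TMP"                # clean up the demo directory
```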

if [ -z "${BASH_VERSINFO}" ] || [ -z "${BASH_VERSINFO[0]}" ] || [ ${BASH_VERSINFO[0]} -lt 4 ]; then echo "This script requires Bash version >= 4"; exit 1; fi
2011-02-25 11:02:47
User: hfs
Functions: echo exit
Tags: bash version
0

If you use new features of a certain Bash version in your shell script, make sure that it actually runs with the required version.

sqlite3 -csv ~/.thunderbird/*.default/calendar-data/local.sqlite "SELECT CASE WHEN priority IS NULL THEN 5 ELSE priority END AS priority, title FROM cal_todos WHERE ical_status IS NULL ORDER BY priority ASC, last_modified DESC;"
tail -f file | while read line; do echo -n $(date -u -Ins); echo -e "\t$line"; done
2010-11-19 10:01:57
User: hfs
Functions: date echo file read tail
Tags: tail date
6

This is useful when watching a log file that does not contain timestamps itself.

If the file already has content when the command is started, the first lines will carry the "wrong" timestamp: the time the command was started, not the time the lines were originally written.
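The loop can be packaged as a reusable filter; the name "stamp" is made up here, and `-Ins` is a GNU date option. `IFS=` and `read -r` keep leading whitespace and backslashes in the log lines intact:

```shell
stamp() {
    while IFS= read -r line; do
        printf '%s\t%s\n' "$(date -u -Ins)" "$line"
    done
}
# usage: tail -f /var/log/messages | stamp
```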

od -N 4 -t uL -An /dev/random | tr -d " "
2010-11-09 07:57:16
User: hfs
Functions: od tr
Tags: random
2

Reads 4 bytes from the random device and formats them as an unsigned integer between 0 and 2^32-1.
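A variant pinned to a 4-byte unsigned type (`-t u4`), so the result does not depend on the platform's sizeof(long), and reading from /dev/urandom to avoid blocking when the entropy pool is low:

```shell
# prints one random integer in [0, 2^32-1]
od -A n -N 4 -t u4 /dev/urandom | tr -d ' '
```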

to() { eval dir=\$$1; cd "$dir"; }
2010-10-15 13:40:35
User: hfs
Functions: cd eval
4

Set a bookmark as a normal shell variable

p=/cumbersome/path/to/project

To go there

to p

This saves one "$" and is faster to type ;-) The variable is still useful as such:

vim $p/<TAB>

will expand the variable (at least in bash) and show a list of files to edit.

If setting the bookmarks is too much typing you could add another function

bm() { eval $1=$(pwd); }

then bookmark the current directory with

bm p
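The eval-based helpers break on paths containing spaces; quoting-hardened variants are a small sketch of my own (the `bm` body relies on the fact that word splitting does not happen in an assignment):

```shell
to() { eval "dir=\"\$$1\""; cd "$dir"; }
bm() { eval "$1=\$(pwd)"; }   # assignment context: no word splitting
```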
tail -f /var/log/messages | while read line; do accu="$line"; while read -t 1 more; do accu=`echo -e "$accu\n$more"`; done; notify-send "Syslog" "$accu"; done
2010-10-10 16:28:08
User: hfs
Functions: read tail
1

The given example collects output of the tail command: whenever a line is emitted, further lines are collected until no more output comes for one second. This group of lines is then sent as a notification to the user.

You can test the example with

logger "First group"; sleep 1; logger "Second"; logger "group"
sftp-cp() { for each in "$@"; do echo "put \"$each\" \".$each\""; echo "rename \".$each\" \"$each\""; done };
2010-05-12 13:13:51
User: hfs
Functions: echo sftp
Tags: sftp
1

Usage:

sftp-cp * | sftp user@host:/dir

This is useful if there is a process on the remote machine waiting for files in an incoming directory: each file is uploaded under a hidden dot-name and renamed into place once complete, so a process that ignores hidden files never sees half-transmitted ones.

for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
2010-03-20 16:11:12
User: hfs
Functions: mv
-1

This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.
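Note that `rot13` is not a standard command; the rename loop above assumes a helper along these lines:

```shell
# rotate each letter 13 places; applying it twice restores the input
rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }
```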

perl -ne 'BEGIN{undef $/}; print "$ARGV\t$.\t$1\n" if m/(first line.*\n.*second line)/mg'
2010-03-18 15:46:10
User: hfs
Functions: perl
Tags: perl grep
7

Using perl you can search for patterns spanning several lines, a thing that grep can't do. Append the list of files to the above command, or pipe a file through it, just as with regular grep. If you add the 's' modifier to the regex, the dot '.' also matches line endings, useful if you don't know how many lines lie between parts of your pattern. Change '*' to '*?' to make it non-greedy, that is, match as few characters as possible.

See also http://www.commandlinefu.com/commands/view/1764/display-a-block-of-text-with-awk to do a similar thing with awk.

Edit: The undef has to be put in a BEGIN block, or a match in the first line would not be found.

dir=$(pwd); while [ ! -z "$dir" ]; do ls -ld "$dir"; dir=${dir%/*}; done; ls -ld /
2009-12-14 14:38:11
User: hfs
Functions: dir ls
2

Useful if a different user cannot access some directory and you want to know which directory on the way is missing the x bit.

perl -ne '$pkg=$1 if m/^Package: (.*)/; print "$1\t$pkg\n" if m/^Installed-Size: (.*)/;' < /var/lib/dpkg/status | sort -rn | less
2009-10-19 12:55:59
User: hfs
Functions: perl sort
0

List packages and their disk usage in decreasing order. This uses the "Installed-Size" field from the package metadata. It may differ from the space actually used, because e.g. data files (think of databases) or log files may take additional space.
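On systems with dpkg-query, the same listing can be produced without parsing the status file by hand; a sketch, with the field names as documented in dpkg-query(1) (the function name `pkg_sizes` is mine):

```shell
pkg_sizes() {
    dpkg-query -W -f='${Installed-Size}\t${Package}\n' | sort -rn
}
# usage: pkg_sizes | less
```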

tr -c -d 0-9 < /dev/urandom | head -c 10
mmv 'banana_*_*.asc' 'banana_#2_#1.asc'
2009-10-01 13:49:40
User: hfs
Tags: rename
9

Use 'mmv' for mass renames. The globbing syntax is intuitive.

<Shift + W>
2009-09-23 13:51:22
User: hfs
Tags: top
13

'top' has fancy layout modes where you can have several windows with different things displayed. You can configure a layout and then save it with 'W'. It will then be restored every time you run top.

E.g. to have two colored windows, one sorted by CPU usage, the other by memory usage, run top

top

then press the keys

<A> <z> <a> <-> <a> <z> <a> <-> <a>

and then, as you don't want to repeat this the next time:

<W>
for each in `cut -d " " -f 1 inputfile.txt`; do echo "select * from table where id = \"$each\";"; done
2009-09-23 13:29:16
User: hfs
Functions: echo
Tags: echo cut for-each
0

I never can remember the syntax of awk. You can give a different -d option to cut to separate by e.g. commas. This also lets you do more with the generated SQL, e.g. redirect it into different files.
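For reference, the awk equivalent (sample input piped in; the table and column names are the same placeholders as above):

```shell
printf 'id1 foo\nid2 bar\n' |
awk '{ printf "select * from table where id = \"%s\";\n", $1 }'
```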