Terminal - Commands tagged bash - 724 results
if [[ ":$PATH:" != *":$dir:"* ]]; then PATH=${PATH}:$dir; fi
2013-08-11 01:19:13
User: dmmst19
Tags: bash PATH $PATH
9

Sometimes in a script you want to make sure that a directory is in the path, and add it in if it's not already there. In this example, $dir contains the new directory you want to add to the path if it's not already present.

There are multiple ways to do this, but this one is a nice clean shell-internal approach. I based it on http://stackoverflow.com/a/1397020.

You can also do it using tr to separate the path into lines and grep -x to look for exact matches, like this:

if ! echo "$PATH" | tr ":" "\n" | grep -qx "$dir" ; then PATH=$PATH:$dir ; fi

which I got from http://stackoverflow.com/a/5048977.

Or replace the "echo | tr" part with a shell parameter expansion, like

if ! echo "${PATH//:/$'\n'}" | grep -qx "$dir" ; then PATH=$PATH:$dir ; fi

which I got from http://www.commandlinefu.com/commands/view/3209/.

There are also other more regex-y ways to do it, but I find the ones listed here easiest to follow.

Note some of this is specific to the bash shell.
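Wrapped up as a reusable function (a minimal sketch; pathadd is an arbitrary name, not from the original post):

pathadd() { if [[ ":$PATH:" != *":$1:"* ]]; then PATH="$PATH:$1"; fi; }

pathadd /usr/local/myapp/bin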

open() { explorer /e, $(cygpath -wap "${1:-$PWD}"); }
2013-08-08 14:49:15
User: applemcg
0

Uses the shell's default-value parameter expansion, ${X:-default}, in lieu of explicitly testing whether $1 was supplied.
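For example (a usage sketch):

open                        # no argument: ${1:-$PWD} falls back to $PWD
open /cygdrive/c/Windows    # explicit argument: ${1:-$PWD} expands to it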

ps -LF -u user
2013-08-06 21:50:48
User: jld
Functions: ps
Tags: bash processes
0

Piping ps into grep is mostly useless: ps has its own filter options, such as -u (select by user) and -C (select by command name).
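For example, to see the threads of a particular daemon (a sketch; sshd is just an illustrative process name):

ps -LF -C sshd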

for i in '/tmp/file 1.txt' '/tmp/file 2.jpg'; do ln -s "$i" "$i LINK"; done
2013-08-02 08:30:50
User: qwertyroot
Functions: ln
0

Replace

'/tmp/file 1.txt' '/tmp/file 2.jpg'

with

"$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS"

for a Nautilus script.

Or with

%F

for a Thunar custom action.

If you are operating on symlinks themselves but want the new links to point at the original source files rather than the symlinks, use

"`readlink -m "$i"`"

instead of

"$i"

like this:

for i in '/tmp/file 1.txt' '/tmp/file 2.jpg'; do ln -s "`readlink -m "$i"`" "$i LINK"; done


for fil in *.JPG; do datepath="$(identify -verbose "$fil" | grep DateTimeOri | awk '{print $2"_"$3}' | sed 's%:%_%g')"; mv -v "$fil" "$datepath.jpg"; done
2013-08-02 01:42:04
Functions: mv
0

Requires ImageMagick.

Extracts the date taken from each image's EXIF data and renames the file accordingly.

Based on a StackOverflow answer.
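If exiftool is available, it can do the same rename in one step (a sketch; assumes the same YYYY_MM_DD_HH_MM_SS target naming that the sed above produces):

exiftool '-FileName<DateTimeOriginal' -d '%Y_%m_%d_%H_%M_%S.jpg' *.JPG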

rhost() { if [[ $1 =~ ^[0-9]+$ ]]; then sed -i "$1"d ${HOME}/.ssh/known_hosts; else echo "rhost [n]"; fi }
2013-08-01 21:10:34
User: lowjax
Functions: echo sed
-1

Quickly remove the conflicting line (key) from the current user's known_hosts file when there is an SSH host key conflict. Very nice when you get tired of writing out full commands. Ideally you would place this in your .bash_profile.

Usage: rhost [n]

Example: rhost 33 (removes line 33 from ~/.ssh/known_hosts)

The function assumes $HOME is set; you could alternatively use "~/.ssh/known_hosts".

On Mac OS X, sed -i requires a (possibly empty) backup-suffix argument after a space: sed -i "" "$1"d
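OpenSSH also ships a standard way to do this by host name rather than line number, which avoids counting lines at all (server.example.com is a placeholder for the conflicting host):

ssh-keygen -R server.example.com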

open(){ if [[ -n "$1" ]];then explorer /e, $(cygpath -mal "$PWD/$1");else explorer /e, $(cygpath -mal "$PWD");fi }
2013-07-31 01:15:14
User: lowjax
1

This alternative opens the current working directory when the open function is issued with no argument, or any directory that you specify.

Usage: open [path]

Example: open /cygdrive/c/Windows

When no path is specified it will open the current working directory.

for m in `df -P | awk -F ' ' '{print $NF}' | sed -e "1d"`;do n=`df -P | grep "$m$" | awk -F ' ' '{print $5}' | cut -d% -f1`;i=0;if [[ $n =~ ^-?[0-9]+$ ]];then printf '%-25s' $m;while [ $i -lt $n ];do echo -n '=';let "i=$i+1";done;echo " $n";fi;done
2013-07-29 20:12:39
User: drockney
Functions: awk cut echo grep printf sed
Tags: bash
5

Automatically drops mount points that have non-numeric sizes (e.g. /proc). Tested in bash on Linux and AIX.
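The same logic unrolled into a readable multi-line form (a sketch; it should behave like the one-liner, but reads df once instead of once per mount point):

df -P | sed 1d | while read -r fs blocks used avail pct mount; do
  n=${pct%\%}                           # strip the trailing % sign
  [[ $n =~ ^-?[0-9]+$ ]] || continue    # drop non-numeric entries such as /proc
  printf '%-25s' "$mount"
  for ((i = 0; i < n; i++)); do printf '='; done
  printf ' %s\n' "$n"
done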

echo -e "\e[3$(( $RANDOM * 6 / 32767 + 1 ))mHello World!"
2013-07-28 13:01:12
User: nst
Functions: echo
Tags: bash color random
0

The expression $(( $RANDOM * 6 / 32767 + 1 )) generates a number between 1 and 6 (almost always: $RANDOM ranges from 0 to 32767 inclusive, so the single value 32767 yields 7), which is then inserted into the escape sequence \e[3_m to switch the foreground color of the terminal to red, green, yellow, blue, purple or cyan.

The color can be reset using the escape sequence \e[0m.

The full list of colors can be found here: https://wiki.archlinux.org/index.php/Color_Bash_Prompt#List_of_colors_for_prompt_and_Bash
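A variant that sidesteps the rare 7 (white) result and resets the color afterwards (a sketch):

echo -e "\e[3$(( RANDOM % 6 + 1 ))mHello World!\e[0m"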

for i in {1..31}; do ls -1 *${YYYY}${MM}`printf "%02d" $i`* | wc -l; done
2013-07-26 07:08:04
User: Paulus
Functions: ls wc
Tags: bash Linux
0

Counts, for each day of the month, the files in the current directory whose names match the given mask; the variables $YYYY and $MM must be set to the year and month of interest beforehand.
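For example (a usage sketch with illustrative values):

YYYY=2013; MM=07; for i in {1..31}; do ls -1 *${YYYY}${MM}$(printf "%02d" $i)* | wc -l; done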

du -m --max-depth=1 [DIR] | sort -nr
ps -eLF | grep ^user
2013-07-24 09:53:12
User: balsagoth
Functions: grep ps
Tags: bash processes
0

This shows all processes (-e) and their threads (-L) in full format (-F).

link=https://www.dropbox.com/login ; curl -b a -c cookie -d "t=$(curl -c a $link | sed -rn 's/.*TOKEN: "([^"]*).*/\1/p')&login_email=me%40yahoo.com&login_password=my_passwd" $link
2013-07-12 07:43:21
User: nixnax
Functions: link
1

Use the command line to log into Dropbox. You have to replace me@yahoo.com with your Dropbox email (note the URL-encoding of "@" as %40) and my_passwd with your Dropbox password. Special characters in your password (such as #) must also be URL-encoded. You will get a cookie (stored in the file "cookie") that you can use for subsequent curl operations against Dropbox, for example curl -b cookie https://www.dropbox.com/home. Debug note: if you want to see what data curl posts, use curl's --trace-ascii flag.
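curl can do the URL-encoding for you with --data-urlencode, so special characters in the password need no manual escaping (a sketch using the same placeholders):

link=https://www.dropbox.com/login ; curl -b a -c cookie -d "t=$(curl -c a $link | sed -rn 's/.*TOKEN: "([^"]*).*/\1/p')" --data-urlencode "login_email=me@yahoo.com" --data-urlencode "login_password=my_passwd" $link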

while curl -sL example.com 2>&1 | grep 503;do sleep 8;done;echo server up
mogrify -resize SIZE_IN_PIXELS *.jpg
2013-07-05 14:14:04
User: o0110o
-1

Batch-resize all images to a width of 'X' pixels while maintaining the aspect ratio.

This makes use of ImageMagick to make life easier.
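For example, mogrify -resize 800 *.jpg scales every JPEG to 800 pixels wide. Note that mogrify overwrites files in place; its -path option writes the results to another directory instead (a sketch; "resized" is an arbitrary directory name):

mkdir -p resized && mogrify -path resized -resize 800 *.jpg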

for y in {2009..2013}; do cal $y; done
find . -empty -type d -print0 | xargs -0 rmdir -p
2013-07-01 02:44:57
User: rafar
Functions: find rmdir xargs
0

It starts in the current working directory.

It removes each empty directory and its ancestors (unless an ancestor contains something other than the empty directory itself).

It will print a failure message for every directory that isn't empty.

This command handles correctly directory names containing single or double quotes, spaces or newlines.

If you do not want to remove the ancestors, just use:

find . -empty -type d -print0 | xargs -0 rmdir
find . -type f ! -path \*CVS\* -exec rm {} \; -exec cvs remove {} \;
2013-06-28 20:17:40
User: jasonsydes
Functions: cvs find rm
Tags: bash cvs delete rm
0

This command deletes, and then cvs-removes, every file under the current directory recursively, skipping paths that contain CVS metadata.

while true; do curl -vsL -o /dev/null example.com 2>&1 | grep 503 > /dev/null || echo "OK: server is up."; sleep 8; done
echo $[RANDOM % 2]
echo $[RANDOM % 100] # range 0-99
2013-05-25 19:02:00
User: anapsix
Functions: echo
-2

Use it to stagger cron jobs or to get a quick random number.

Increase the range by replacing 100 with your own max value. (The $[...] form is deprecated; $((RANDOM % 100)) is the modern equivalent.)
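A typical staggering use, sleeping up to five minutes before the real work starts (a sketch; the script path is illustrative):

sleep $((RANDOM % 300)); /usr/local/bin/nightly-job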

du -mx [directory] | grep -P '^\d{4}' | sort -rn
2013-05-24 09:52:41
User: mc0e
Functions: du grep sort
Tags: bash Linux du
0

I don't like doing a massive sort on all the directory names just to get a small set of them. The above shows a sorted list of all directories over 1 GB (grep -P '^\d{4}' keeps only entries whose size in MB has at least four digits). Use head as well if you want fewer.

du's "-x" flag limits this to one file system. That's mostly useful when you run it on "/" but don't want "/proc", "/dev" and so forth. Remember though that it will also exclude "/home" or "/var" if those are separate partitions.

The "-a" option is often useful too, for listing large files as well as large directories. It might be slower.

du -xB M --max-depth=2 /var | sort -rn | head -n 15
curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq
-1

This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first two columns, passes the result through sort and uniq to remove any duplicates, and finally hands the list to linkchecker, which checks each link. Broken links go to the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious (see their site for info). ~updated for no recursion

=() { echo $(($*)); }
2013-05-03 04:27:07
User: xlz
Functions: echo
3

POSIX-compliant arithmetic evaluation: $(($*)) uses the standard $((...)) arithmetic expansion. (Note that = is not a portable function name; POSIX requires function names to be valid identifiers.)

= 10*2+3   # prints 23