What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged bash - 716 results
echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
2010-03-12 08:42:49
User: AskApache
Functions: echo wc
1

There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to roughly 100 (half of the processes still available).

OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is such a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for a given user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts, and runs up to 100 'file' commands at the same time, greatly speeding up the command.

find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my bash_profile functions ( http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html ), especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so on the jailed shells my xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works, like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

If you were building a process-killer (very common for cheap hosting) this would also be handy.

Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.
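
A minimal guard for that case, as a sketch (the threshold of 4 is arbitrary and only illustrative; OPTIMUM_P is the variable computed above):

[ "$OPTIMUM_P" -lt 4 ] && OPTIMUM_P=1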

alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l $my_file | awk "{print $1}"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r $my_file'
2010-03-09 21:48:41
User: busybee
Functions: alias awk find head sort vim wc
22

This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.
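
One common way to source it, shown as a sketch (many default .bashrc files already contain an equivalent line):

if [ -f ~/.bash_aliases ]; then . ~/.bash_aliases; fi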

file -L <library> | grep -q '64-bit' && echo 'library is 64 bit' || echo 'library is 32 bit'
2010-03-07 06:31:35
User: infinull
Functions: echo file grep
Tags: bash
-3

file displays a file's type.

The -L flag means follow symlinks (since libraries are often symlinks to other files, this behavior is usually what you want).

More complex behavior (*two* grep commands!) could be used to determine whether or not the file is a shared library.
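
A sketch of that two-grep idea, assuming GNU file output containing 'shared object' for libraries and '64-bit' for 64-bit binaries ($lib is a placeholder for the library path):

out=$(file -L "$lib"); echo "$out" | grep -q 'shared object' && { echo "$out" | grep -q '64-bit' && echo '64-bit shared library' || echo '32-bit shared library'; } || echo 'not a shared library'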

libquery=/lib32/libgcc_s.so.1; if [ `nm -D $libquery | sed -n '/[0-9A-Fa-f]\{8,\}/ {p; q;}' | grep "[0-9A-Fa-f]\{16\}" | wc -l` == 1 ]; then echo "$libquery is a 64 bit library"; else echo "$libquery is a 32 bit library"; fi;
2010-03-07 04:24:08
User: birnam
Functions: echo grep sed wc
Tags: bash nm
2

Determines the flavor of a shared library by looking at the addresses of its exposed functions and seeing whether nm prints them as 16 hex digits (64-bit) or 8 hex digits (32-bit). The command is written so the library you are querying is assigned to a variable up front -- it would be simple to convert this to a bash function or script using this format.
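
A sketch of that conversion into a bash function (libbits is a hypothetical name; the logic is unchanged from the one-liner above):

libbits() { if [ `nm -D "$1" | sed -n '/[0-9A-Fa-f]\{8,\}/ {p; q;}' | grep "[0-9A-Fa-f]\{16\}" | wc -l` == 1 ]; then echo "$1 is a 64 bit library"; else echo "$1 is a 32 bit library"; fi; }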

box(){ c=${2-=}; l=$c$c${1//?/$c}$c$c; echo -e "$l\n$c $1 $c\n$l"; unset c l;}
2010-02-26 17:14:52
User: mightybs
Functions: c++ echo unset
Tags: bash echo
2

First argument: string to put a box around.

Second argument: character to use for box (default is '=')

Same as command #4962, cleaned up, shortened, and more efficient. Now a ' * ' can be used as the box character, and the variables get unset so they don't mess with anything else you might have.
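
For example, calling it with '*' as the box character produces (output shown as a sketch):

box "Hello world" "*"

***************
* Hello world *
***************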

They marked c++ as a function for this command, but I'm not sure why. Must be a bug.

any_script.sh < <(some command)
2010-02-21 18:44:33
User: cb0
Tags: bash
9

Sometimes you have a script that needs an input file for execution. If you don't want to create one because it would contain only one line, you can use the <( ) process substitution operator, like this:

mysql -uuser -ppass dbname < <(echo "SELECT * FROM database;")

This can be very useful when working with mysql, as the example code above shows. This creates a temporary file that is used as mysql's input, in this example selecting all entries from a specific database.
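
A minimal sketch of the same idea without mysql, where printf stands in for the one-line input file and wc for the script:

wc -l < <(printf 'the only line the script needs\n')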

SITE="www.google.com"; curl --silent "http://www.shadyurl.com/create.php?myUrl=$SITE&shorten=on" | awk -F\' '/is now/{print $6}'
xv() { case $- in *[xv]*) set +xv;; *) set -xv ;; esac }
2010-02-14 20:57:29
User: cfajohnson
Functions: set
4

Turn shell tracing and verbosity (set -xv) on/off in any Bourne-type shell

If either -x or -v is set, the function turns them both off.

If neither is on, both are turned on.
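
A hedged usage sketch, tracing only the command between the two calls:

xv; echo "this line is traced"; xv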

function setx(){ sed '/[xv]/!Q2' <<< $- && { set +xv; export PS4=">>> "; } || { export PS4="`tput setaf 3`>>> `tput sgr0`"; set -xv; }; }
2010-02-14 01:25:44
User: AskApache
Functions: export sed set
1

Running this command turns shell tracing and shell verbose debugging on or off. Not only does it do that, it also uses your terminal's built-in method of setting colors to make debugging much easier.

It looks at the current shell options contained in the special bash variable $-, which lets the function set the opposite of the current value. So from the shell you could do:

setx; echo "y" | ( cat -t ) | echo "d"; setx

and it will turn on debugging.

This is an amazingly useful function that is perfect to add system-wide by putting it in /etc/profile or /etc/bashrc. You can run it from the shell, and you can also use it in your shell scripts like my .bash_profile - http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

<(!!)
2010-02-06 18:35:10
User: drewk
7

Bash has a great history system for its commands, accessed by the ! built-in history expansion operator (documented elsewhere on this site or on the web). You can combine the ! operator with process substitution, as in <(!!), so the output of the previous command can be read as a file.

Very handy.
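
A hedged example; ls -l /etc just stands in for whatever command you ran last:

ls -l /etc

wc -l <(!!)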

find /path/to/dir -type f -printf "%T@|%p\n" 2>/dev/null | sort -n | tail -n 1| awk -F\| '{print $2}'
newest () { find ${1:-\.} -type f |xargs ls -lrt ; }
newest () { DIR=${1:-'.'}; CANDIDATE=`find $DIR -type f|head -n1`; while [[ ! -z $CANDIDATE ]]; do BEST=$CANDIDATE; CANDIDATE=`find $DIR -newer "$BEST" -type f|head -n1`; done; echo "$BEST"; }
2010-02-04 12:40:44
User: shadycraig
Functions: echo head
1

Works recursively in the specified dir, or '.' if none is given.

Repeatedly calls 'find' to find a newer file; when no newer files exist, you have the newest.

In this case 'newest' means most recently modified. To compare status-change time (ctime) instead, change -newer to -cnewer.
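
The same function with that single test swapped, as a sketch (newest_by_ctime is an illustrative name):

newest_by_ctime () { DIR=${1:-'.'}; CANDIDATE=`find $DIR -type f|head -n1`; while [[ ! -z $CANDIDATE ]]; do BEST=$CANDIDATE; CANDIDATE=`find $DIR -cnewer "$BEST" -type f|head -n1`; done; echo "$BEST"; }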

fbemailscraper YourFBEmail Password
2010-01-31 00:44:35
User: dabom
4

(Apparently it is too long so I put it in sample output, I hope that is OK.)

Run the long command (or put it in your .bashrc) in sample output then run:

fbemailscraper YourFBEmail Password

Voila! Your contacts' emails will appear.

Facebook seems to have gotten rid of the picture encoding of emails and replaced it with a text-based version, making it easy to scrape!

Needs curl to run, and it was made pretty quickly, so there might be bugs.

echo alias xkcd="gwenview `w3m -dump http://xkcd.com/|grep png | awk '{print $5}'` 2> /dev/null" >> .bashrc
2010-01-30 20:38:16
User: GinoMan2440
Functions: alias echo
-5

Add an alias to your .bashrc that allows you to issue the command xkcd to view (with gwenview) the newest xkcd comic... I know there are thousands of these out there, but this one at least comes with its own installer and uses a more concise syntax... plus, gwenview shows you the download progress as it fetches the comic and gives you a more full-featured viewing experience.

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
2010-01-30 13:08:03
User: gthb
Functions: grep sed
7

This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with full path in case you have another one installed).

Still requires that you install the perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489
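
For reference, one common way to install that module from CPAN, assuming the cpan client is configured:

perl -MCPAN -e 'install HTML::Entities'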

cp -arv ~/Documents/{foo,bar} --target-directory=~/buzz/
rename 's/ /_/g' *
2010-01-29 21:15:03
User: tuxilicious
Functions: rename
Tags: bash
8

It's the rename tool from Debian's perl package.

for f in *;do mv "$f" "${f// /_}";done
2010-01-29 19:57:16
User: ethanmiller
Functions: mv
Tags: bash
9

I realize there are a few of these out there, but none exactly in this form, which seems the cleanest to me.

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
2010-01-29 05:01:11
User: eightmillion
Functions: grep perl
18

This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}

If your version of grep doesn't have perl compatible regex support, then you can use this version:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
a="www.commandlinefu.com";b="/index.php";for n in $(seq 1 7);do echo -en "GET $b HTTP/1.0\r\nHost: "$a"\r\n\r\n" |nc $a 80 2>&1 |grep Set-Cookie;done
2010-01-28 14:19:43
User: vlan7
Functions: echo grep seq
Tags: bash cookies
3

The loop is to compare cookies. You can remove it...

Maybe you wanna use curl...

curl www.commandlinefu.com/index.php -s0 -I | grep "Set-Cookie"
svn add $(svn st|grep ^\?|cut -c2-)
2010-01-28 09:48:46
User: inkel
Functions: cut grep
Tags: bash svn grep cut
0

This version makes use of Bash shell expansion, so it might not work in all other shells.
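
A sketch of a more portable variant that pipes into xargs instead of relying on the expansion (the -r flag, which skips empty input, is GNU xargs):

svn st | grep '^?' | cut -c2- | xargs -r svn add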

<ALT> .
<CTRL+w>
<ALT> <BACKSPACE>
2010-01-27 19:52:51
User: wincus
2

Hit BACKSPACE more than once to delete more words. (ALT+. inserts the last argument of the previous command; CTRL+W deletes back to the previous whitespace; ALT+BACKSPACE deletes the previous word.)