What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, and I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands using grep - 1,645 results
grep -rF --include='*.txt' stringYouLookFor *
2010-02-12 18:07:07
User: maxleonca
Functions: grep
1

The -r is for recursive search, -F treats the pattern as a fixed string, and --include='*.txt' restricts the search to .txt files (any wildcard will work). Then comes the string you are looking for, and the final * ensures grep descends through all files and folders below the directory where you run it.
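
If you only need the names of the matching files rather than every matching line, adding -l should do the trick:

grep -rlF --include='*.txt' stringYouLookFor *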

curl -s -H "Authorization: GoogleLogin auth=$auth" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc
2010-02-11 00:42:57
User: matthewbauer
Functions: grep sed tr
-1

Get Google Reader unread count from the command line.

You'll have to define your auth token with $auth

Or use:

curl -s -H "Authorization: GoogleLogin auth=$(curl -sd "Email=$email&Passwd=$password&service=reader" https://www.google.com/accounts/ClientLogin | grep Auth | sed 's/Auth=\(.*\)/\1/')" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc
ls . | xargs file | grep text | sed "s/\(.*\):.*/\1/" | xargs gedit
find -type f -regex ".*\.\(js\|php\|inc\|htm[l]?\|css\)$" -exec grep -il 'searchstring' '{}' +
find . -type f \( -name "*.js" -o -name "*.php" -o -name "*.inc" -o -name "*.html" -o -name "*.htm" -o -name "*.css" \) -exec grep -il 'searchString' {} \;
2010-02-07 15:28:20
User: niels_bom
Functions: find grep
Tags: find grep search
-1

Use find to recursively list all files from the current directory downwards that have one of the listed extensions. Then grep each file found for 'searchString' and print the filename if it matches.
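
Using + instead of \; (as in the previous command) should also be faster on large trees, since grep is then invoked with batches of filenames rather than once per file; for example, for just the .js and .php extensions:

find . -type f \( -name "*.js" -o -name "*.php" \) -exec grep -il 'searchString' {} +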

lynx -dump http://api.wunderground.com/weatherstation/WXCurrentObXML.asp?ID=KCALOSAN32 | grep GMT | awk '{print $3}'
2010-02-05 19:17:18
User: editorreilly
Functions: awk grep
5

Get your weather from a weather station just blocks from your home. Go to http://www.wunderground.com/wundermap/ and find a weather station near you. Click on a temperature bubble for that area. When the window pops up, click on the hypertext link with the station ID, then on the bottom right of the page, click on the Current Conditions XML. That's your link! Good luck!

curl -s https://www.google.com/accounts/ClientLogin -d Email=$email -d Passwd=$password -d service=lh2 | grep Auth | sed 's/Auth=\(.*\)/\1/'
mailq | grep MAILER-DAEMON | awk '{print $1}' | tr -d '*' | postsuper -d -
curl -s http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25] | grep -oP "^\w+\(\)\ *{.*}"
export QQ=$(mktemp -d);(cd $QQ; curl -s -O http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25];for i in $(perl -ne 'print "$1\n" if( /^(\w+\(\))/ )' *|sort -u);do grep -h -m1 -B1 $i *; done)|grep -v '^--' > clf.sh;rm -r $QQ
2010-01-30 19:47:42
User: bartonski
Functions: cd export grep mktemp perl sort
8

Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file.

Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.
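
Once clf.sh has been written, the collected functions can be loaded into the current shell by sourcing the file:

. ./clf.sh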

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
2010-01-30 13:08:03
User: gthb
Functions: grep sed
7

This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with full path in case you have another one installed).

Still requires that you install perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489
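
Assuming a configured CPAN client, installing the module is typically just:

perl -MCPAN -e 'install HTML::Entities'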

find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep
4

Parallel does not risk mixing output the way xargs does. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.

Parallel is from https://savannah.nongnu.org/projects/parallel/
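
A sketch of the null-delimited form mentioned above, for the rare case where filenames contain a newline:

find . -type f -print0 | parallel -0 -j+0 grep -i foobar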

for i in $(vgdisplay -v vg00 | grep "LV Name" | awk '{ print $3 }'); do lvextend -m 1 $i /dev/disk/<here-goes-the-disk>; done
2010-01-29 22:43:07
User: jreypo
Functions: awk grep
Tags: hp-ux lvm
0

Create one mirror copy of every lvol in vg00 just after a cold install of HP-UX 11.31. Can also be used for 11.23, but remember that in 11iv2 there is no agile view, so the disk will be /dev/dsk/cxtxdxs2.

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
2010-01-29 05:01:11
User: eightmillion
Functions: grep perl
18

This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}

If your version of grep doesn't have perl compatible regex support, then you can use this version:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
for i in `netstat -rn |grep lan |cut -c55-60 |sort |uniq`; do ifconfig $i; done
2010-01-28 17:35:20
User: Kaio
Functions: cut grep ifconfig sort
-5

HP-UX doesn't have a -a switch in the ifconfig command.

This line emulates the same result shown on Solaris, AIX or Linux.
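
On those systems the same listing is simply:

ifconfig -a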

a="www.commandlinefu.com";b="/index.php";for n in $(seq 1 7);do echo -en "GET $b HTTP/1.0\r\nHost: "$a"\r\n\r\n" |nc $a 80 2>&1 |grep Set-Cookie;done
2010-01-28 14:19:43
User: vlan7
Functions: echo grep seq
Tags: bash cookies
3

The loop is to compare cookies. You can remove it...

Maybe you wanna use curl...

curl www.commandlinefu.com/index.php -s0 -I | grep "Set-Cookie"
find directory/ -exec grep -ni phrase {} +
2010-01-28 12:15:24
User: sanmiguel
Functions: find grep
Tags: find grep
0

The difference between this and the other alternatives here using only grep is that find will, by default, not follow a symlink. In some cases, this is definitely desirable.

Using find also allows you to exclude certain files, eg

find directory/ ! -name "*.tmp" -exec grep -ni phrase {} +

would exclude any .tmp files.

Also note that there's no need for calling grep recursively, as find passes each found file to grep.
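
Conversely, if you do want symlinks followed, find's -L option should do it:

find -L directory/ -exec grep -ni phrase {} +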

ipcs -a | grep 0x | awk '{printf( "-Q %s ", $1 )}' | xargs ipcrm
svn add $(svn st|grep ^\?|cut -c2-)
2010-01-28 09:48:46
User: inkel
Functions: cut grep
Tags: bash svn grep cut
0

This version makes use of Bash shell expansion, so it might not work in all other shells.
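
A rough sketch of a more portable variant that should also cope with filenames containing spaces (though not newlines):

svn st | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done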

svn status |grep '\?' |awk '{print $2}'| parallel -Xj1 svn add
2010-01-28 08:47:54
Functions: awk grep
Tags: xargs parallel
-2

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
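
For comparison, a null-delimited pipeline (GNU find and xargs) avoids the word-splitting problem in that example without needing parallel:

find . -maxdepth 1 -name 'not*' -print0 | xargs -0 rm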

grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
2010-01-28 08:44:16
Functions: grep sed
4

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

grep -E '^(cn|mail):' file.ldif | sed -e 's/^[a-z]*: //'
find filepath -type f -iname "*.html" -o -iname "*.htm" -o -iname "*.php" | xargs grep "Exception\|LGPL\|CODE1"
mgc() { grep --exclude=cscope* --color=always -rni $1 . |perl -pi -e 's/:/ +/' |perl -pi -e 's/^(.+)$/vi $1/g' |perl -pi -e 's/:/ /'; }
2010-01-26 17:00:01
Functions: grep perl
1

This is a big time saver for me. I often grep source code and need to edit the findings. A single highlight of the mouse and middle mouse click (in gnome terminal) and I'm editing the exact line I just found. The color highlighting helps interpret the data.
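
Typical usage would be something like mgc TODO, which emits one line per match, roughly of the form (hypothetical path and line number):

vi ./src/main.c +42 <matched text>

Pasting such a line back into the shell opens the file at that match.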

grep -rni string dir
2010-01-26 16:34:06
Functions: grep
2

Print line numbers too, so you don't have to search through the file once it's open for the string you already grepped for.