
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta; it's not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using grep - 1,655 results
spellcheck(){ curl -sd "<spellrequest><text>$1</text></spellrequest>" https://www.google.com/tbproxy/spell | sed 's/.*<spellresult [^>]*>\(.*\)<\/spellresult>/\1/;s/<c \([^>]*\)>\([^<]*\)<\/c>/\1;\2\n/g' | grep 's="1"' | sed 's/^.*;\([^\t]*\).*$/\1/'; }
ls *.jpg | grep -n "" | sed 's,.*,0000&,' | sed 's,0*\(...\):\(.*\).jpg,mv "\2.jpg" "image-\1.jpg",' | sh
curl -s -u $username:$password http://192.168.1.1/DHCPTable.htm | grep '<td>.* </td>' | sed 's|\t<td>\(.*\) </td>\r|\1|' | tr '\n' ';' | sed 's/\([^;]*\);\([^;]*\);/\2\t\1\n/g'
2010-02-16 02:27:11
User: matthewbauer
Functions: grep sed tr
0

Will create a sample /etc/hosts file based on your router's DHCP list.

Now I know this won't work on most routers, so please don't downvote it just because it doesn't work for you.

for i in `ndd /dev/ip \? | awk '{ print $1 }' | egrep -v "ip6|status|icmp|igmp|\?"` ; do echo $i `ndd -get /dev/ip $i` ; done | grep -v \?
2010-02-15 12:32:33
User: felix001
Functions: awk echo egrep grep
0

This command is just for the main IP settings of ndd. If you need ip6 or icmp, edit the pattern inside the egrep -v exclusion.

Felix001 - www.Fir3net.com
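
For illustration, a single tunable can also be queried directly; the parameter name below is just a common example, not part of the original command:

ndd -get /dev/ip ip_forwarding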

pdftotext [file] - | grep 'YourPattern'
2010-02-14 21:42:35
User: drewk
Functions: grep
Tags: pipe grep pdf
27

PDF files are simultaneously wonderful and heinous. They are wonderful in being ubiquitous and mostly cross platform. They are heinous in being very difficult to work with from the command line: hard to search, to grep, to use only the text inside the PDF, or to use outside of proprietary products.

xpdf is a wonderful set of PDF tools. It is on many Linux distros and can be installed on OS X. While primarily an open PDF viewer for X, xpdf includes the tool "pdftotext" that can extract formatted or unformatted text from inside a PDF that has text. This text stream can then be further processed by grep or other tools. The '-' after the file name directs output to stdout rather than to a text file with the same name as the PDF.

Make sure you use version 3.02 of pdftotext or later; earlier versions clipped lines.

The lines extracted from a PDF without the "-layout" option are very long, more like whole paragraphs; use that form just to test that a pattern exists in the file. With "-layout" the output resembles the original lines, but it is not perfect.

xpdf is available open source at http://www.foolabs.com/xpdf/
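
As a sketch of the "-layout" form described above (the file name and search pattern are placeholders):

pdftotext -layout report.pdf - | grep -i 'total'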

grep -c ^processor /proc/cpuinfo
xrandr -q | grep -w Screen
username=matthewbauer; curl -s http://www.commandlinefu.com/commands/by/$username/json | tr '{' '\n' | grep -Eo ',"votes":"[0-9\-]+","' | grep -Eo '[0-9\-]+' | tr '\n' '+' | sed 's/+$/\n/' | bc
2010-02-14 04:32:36
User: matthewbauer
Functions: grep sed tr
8

This will calculate your commandlinefu votes (upvotes - downvotes).

Hopefully this will boost my commandlinefu points.

curl -s http://www.google.com/ig/api?weather=$(curl -s "http://api.hostip.info/get_html.php?ip=$(curl -s icanhazip.com)" | grep City | sed 's/City: \(.*\)/\1/' | sed 's/ /%20/g' | sed "s/'/%27/g") | sed 's|.*<temp_f data="\([^"]*\)"/>.*|\1\n|'
uri_escape(){ echo -E "$@" | sed 's/\\/\\\\/g;s/./&\n/g' | while read -r i; do echo $i | grep -q '[a-zA-Z0-9/.:?&=]' && echo -n "$i" || printf %%%x \'"$i"; done; }
2010-02-13 01:39:51
User: infinull
Functions: echo grep printf read sed
1

This one uses hex conversion to do the escaping and is shell/sed only (you should probably still use the python/perl version).

grep -rF --include='*.txt' stringYouLookFor *
2010-02-12 18:07:07
User: maxleonca
Functions: grep
1

The -r is for recursive and -F is for fixed strings. --include='*.txt' says you want only txt files to be searched (any wildcard will apply), then comes the string you are looking for, and the final * ensures you go through all files and folders within the directory where you execute it.
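
A quick illustration of why -F matters: with fixed strings, regex metacharacters in the search term are taken literally (the string below is just a made-up example):

grep -rF --include='*.txt' 'price ($10)' *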

curl -s -H "Authorization: GoogleLogin auth=$auth" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc
2010-02-11 00:42:57
User: matthewbauer
Functions: grep sed tr
-1

Get Google Reader unread count from the command line.

You'll have to define your auth token with $auth

Or use:

curl -s -H "Authorization: GoogleLogin auth=$(curl -sd "Email=$email&Passwd=$password&service=reader" https://www.google.com/accounts/ClientLogin | grep Auth | sed 's/Auth=\(.*\)/\1/')" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc
ls . | xargs file | grep text | sed "s/\(.*\):.*/\1/" | xargs gedit
find -type f -regex ".*\.\(js\|php\|inc\|htm[l]?\|css\)$" -exec grep -il 'searchstring' '{}' +
find . -type f \( -name "*.js" -o -name "*.php" -o -name "*.inc" -o -name "*.html" -o -name "*.htm" -o -name "*.css" \) -exec grep -il 'searchString' {} \;
2010-02-07 15:28:20
User: niels_bom
Functions: find grep
Tags: find grep search
-1

Use find to recursively make a list of all files from the current directory downwards that have one of the listed extensions. Every file found is then grepped for 'searchString'; the filename is printed if searchString is found.
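
If your grep is GNU grep, a roughly equivalent single-command sketch (not a drop-in replacement for every find feature) is:

grep -ril 'searchString' --include='*.js' --include='*.php' --include='*.inc' --include='*.html' --include='*.htm' --include='*.css' .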

lynx -dump http://api.wunderground.com/weatherstation/WXCurrentObXML.asp?ID=KCALOSAN32 | grep GMT | awk '{print $3}'
2010-02-05 19:17:18
User: editorreilly
Functions: awk grep
5

Get your weather from a weather station just blocks from your home. Go to http://www.wunderground.com/wundermap/ and find a weather station near you. Click on a temperature bubble for that area. When the window pops up, click on the hypertext link with the station ID, then on the bottom right of the page, click on the Current Conditions XML. That's your link! Good luck!

curl -s https://www.google.com/accounts/ClientLogin -d Email=$email -d Passwd=$password -d service=lh2 | grep Auth | sed 's/Auth=\(.*\)/\1/'
mailq | grep MAILER-DAEMON | awk '{print $1}' | tr -d '*' | postsuper -d -
curl -s http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25] | grep -oP "^\w+\(\)\ *{.*}"
export QQ=$(mktemp -d);(cd $QQ; curl -s -O http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25];for i in $(perl -ne 'print "$1\n" if( /^(\w+\(\))/ )' *|sort -u);do grep -h -m1 -B1 $i *; done)|grep -v '^--' > clf.sh;rm -r $QQ
2010-01-30 19:47:42
User: bartonski
Functions: cd export grep mktemp perl sort
8

Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file.

Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.
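
Assuming the pipeline completed and clf.sh ended up in the current directory, the collected functions can then be loaded into the running shell with:

. ./clf.sh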

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
2010-01-30 13:08:03
User: gthb
Functions: grep sed
7

This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with full path in case you have another one installed).

Still requires that you install the perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489
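
For reference, one common way to install that module (assuming CPAN is set up on your machine) is:

perl -MCPAN -e 'install HTML::Entities'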

find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep
4

Parallel does not suffer from the risk of mixing output that xargs suffers from. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.
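
A sketch of that null-delimited form, for trees that may contain such filenames:

find . -type f -print0 | parallel -0 -j+0 grep -i foobar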

Parallel is from https://savannah.nongnu.org/projects/parallel/

for i in $(vgdisplay -v vg00 | grep "LV Name" | awk '{ print $3 }'); do lvextend -m 1 $i /dev/disk/<here-goes-the-disk>; done
2010-01-29 22:43:07
User: jreypo
Functions: awk grep
Tags: hp-ux lvm
0

Create one mirror copy of every lvol in vg00 just after a cold install of HP-UX 11.31. Can also be used for 11.23, but remember that in 11iv2 there is no agile view, so the disk will be /dev/dsk/cxtxdxs2.
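
A sketch of that 11iv2 variant of the same loop (cXtXdXs2 below is a placeholder for your actual mirror disk):

for i in $(vgdisplay -v vg00 | grep "LV Name" | awk '{ print $3 }'); do lvextend -m 1 $i /dev/dsk/cXtXdXs2; done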

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
2010-01-29 05:01:11
User: eightmillion
Functions: grep perl
18

This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}

If your version of grep doesn't have perl compatible regex support, then you can use this version:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
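
For example, invoking the function with a multi-word phrase (the phrase itself is just an illustration) looks like:

define ad hoc
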
for i in `netstat -rn |grep lan |cut -c55-60 |sort |uniq`; do ifconfig $i; done
2010-01-28 17:35:20
User: Kaio
Functions: cut grep ifconfig sort
-5

HP-UX doesn't have an -a switch in the ifconfig command.

This line emulates the same result shown on Solaris, AIX or Linux.