What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using egrep - 195 results
sudo aptitude update; sudo apt-get -y --print-uris upgrade | egrep -o -e "http://[^\']+" | sudo aria2c -c -d /var/cache/apt/archives -i -; sudo aptitude -y safe-upgrade
2010-02-18 16:02:29
User: freethinker
Functions: egrep sudo
Votes: 2

Please install aria2 before you try the above command. On Ubuntu, the command to install it would be:

sudo aptitude install aria2
for i in `ndd /dev/ip \? | awk '{ print $1 }' | egrep -v "ip6|status|icmp|igmp|\?"` ; do echo $i `ndd -get /dev/ip $i` ; done | grep -v \?
2010-02-15 12:32:33
User: felix001
Functions: awk echo egrep grep
Votes: 0

This command covers just the main IP settings of ndd. If you need ip6 or icmp, edit the pattern in the egrep -v exclusion list.

Felix001 - www.Fir3net.com
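
For instance, to pull in the ip6 parameters as well, you might drop ip6 from the exclusion pattern (a sketch; the exact parameter set varies by Solaris release):

for i in `ndd /dev/ip \? | awk '{ print $1 }' | egrep -v "status|icmp|igmp|\?"` ; do echo $i `ndd -get /dev/ip $i` ; done | grep -v \?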

function svnurl() { svn info $1 | egrep '^URL: (.*)' | sed s/URL\:\ //; }
2010-02-12 15:42:54
User: thebuckst0p
Functions: egrep info sed
Votes: 0

Can be used in a working copy to output the URL (extracted from svn info), or as part of another function, as $(svnurl some/path). Saves a lot of time in my SVN workflow.
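
For example, to use it inside another command (the svn log call here is just an illustration):

svn log --limit 5 $(svnurl .)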

sudo netselect -v -s3 $(curl -s http://dns.comcast.net/dns-ip-addresses2.php | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' | sort | uniq)
2010-01-27 00:03:44
User: hackerb9
Functions: egrep sort sudo
Votes: 2

Comcast is an ISP in the United States that has started hijacking DNS requests as a "service" for its customers. For example, in Firefox, one used to be able to do a quick "I'm Feeling Lucky" Google search by typing a single word into the URL field, assuming the word is not an existing domain when surrounded by www.*.com. Comcast customers never receive the correct NX (non-existent domain) error from DNS. Instead, they are shown a page full of advertising. There is a way to "opt out" from their service, but that requires having the account password and the MAC address of your modem handy. For me, it was easier just to set static DNS servers. But the problem is, which ones to choose? That's what this command answers. It'll show you the three _non-hijacked_ Comcast DNS servers that are the shortest distance away.

Perhaps you don't have Comcast (lucky you!), but hopefully this command can serve as an example of using netselect to find the fastest server from a list. Note that, although this example doesn't show it, netselect will actually perform the uniq and DNS resolution for you.

Requires: netselect, curl, sort, uniq, grep
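
As a more generic sketch of the same technique, netselect can rank any hand-picked list of hosts (the addresses below are the well-known Google and OpenDNS resolvers, not part of the original command):

sudo netselect -v -s3 8.8.8.8 8.8.4.4 208.67.222.222 208.67.220.220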

nmap -T4 -sP 192.168.2.0/24 && egrep "00:00:00:00:00:00" /proc/net/arp
nmap -sP <subnet>.* | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' > results.txt ; for IP in {1..254} ; do echo "<subnet>.${IP}" ; done >> results.txt ; cat results.txt | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 | uniq -u
lynx --dump --source http://www.xkcd.com | grep `lynx --dump http://www.xkcd.com | egrep '(png|jpg)'` | grep title | cut -d = -f2,3 | cut -d '"' -f2,4 | sed -e 's/"/|/g' | awk -F"|" ' { system("display " $1);system("echo "$2); } '
2009-12-03 18:53:57
Functions: awk cut egrep grep
Votes: -1

Same thing, just a different way to get there. You will need lynx (and ImageMagick's display for the image viewing).

egrep -i "somepattern" `find . -type f -print`
find . -type f | perl -lne 'print if -T;' | xargs egrep "somepattern"
egrep 'https?://([[:alpha:]]([-[:alnum:]]+[[:alnum:]])*\.)+[[:alpha:]]{2,3}(:\d+)?(/([-\w/_\.]*(\?\S+)?)?)?'
2009-11-28 15:41:42
User: putnamhill
Functions: egrep
Votes: 5

For the record: I didn't build this. Just shared what I found that worked. Apologies to the original author!

I decided I should fix the case where http://example.com is not matched for the next time I need this. So I read rfc1035 and formalized the host name regex.

If anyone finds any more holes, please comment.
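
A quick way to exercise it is to extract matches from a file with -o (the file name is hypothetical):

egrep -o 'https?://([[:alpha:]]([-[:alnum:]]+[[:alnum:]])*\.)+[[:alpha:]]{2,3}(:\d+)?(/([-\w/_\.]*(\?\S+)?)?)?' access.log | sort -u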

egrep "<link>|<title>" recenttracks.rss | awk 'ORS=NR%2?" ":"\n"' | awk -F "</title>" '{print $2, $1}' | sed -e 's/\<link\>/\<li\>\<a href\=\"/' -e 's/\<\/link\>/\">/' -e 's/\<title\>//' -e 's/$/\<\/a\>\<\/li\>/g' -e '1,1d' -e 's/^[ \t]*//'
2009-11-28 13:19:05
User: HerbT
Functions: awk egrep sed
Votes: 3

Quick and kludgy rss parser for the recent tracks rss feed from last.fm. Extracts artist and track link.
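
To produce the input file, you would first save the feed locally (the last.fm feed URL pattern here is from memory and may have changed; the username is hypothetical):

curl -s "http://ws.audioscrobbler.com/1.0/user/someuser/recenttracks.rss" -o recenttracks.rss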

lsmod | cut -d' ' -f1 | xargs modinfo | egrep '^file|^desc|^dep' | sed -e'/^dep/s/$/\n/g'
2009-11-17 02:13:34
User: mohan43u
Votes: 2

Run this as root; it is a quick way to get information about the loaded kernel modules.
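
The same idea scoped to a single module (the module name is just an example):

modinfo ext4 | egrep '^file|^desc|^dep'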

geoip () { curl -s "http://www.geoiptool.com/?IP=$1" | html2text | egrep --color 'City:|IP Address:|Country:'; }
2009-11-15 17:59:23
User: wizel
Functions: egrep
Votes: 0

If used without arguments, returns info on your own IP.

If used with an argument, returns info about the passed argument.
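
For example (the address is just an example; html2text must be installed):

geoip 8.8.8.8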

ls -a | egrep "^\.\w"
2009-11-11 18:19:56
User: kulor
Functions: egrep ls
Tags: egrep ls dotfiles
Votes: -2

If you're trying to copy all your dotfiles from one location to another, this may help.
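
One way to use it, run from the directory that holds the dotfiles and assuming none of the names contain spaces (the destination is hypothetical):

cp -r $(ls -a | egrep "^\.\w") /path/to/backup/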

egrep -v "^[[:blank:]]*($|#|//|/\*| \*|\*/)" somefile
find ~/Maildir/ -mindepth 1 -type d | egrep -v '/cur$|/tmp$|/new$' | xargs
tail -F file | egrep --color 'pattern|$'
tail -f file | egrep --color=always $\|PATTERN
2009-10-15 13:08:30
User: sitaram
Functions: egrep file tail
Tags: color
Votes: -2

but you can't see the colors in that sample output :(
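
The trick is that $ matches the empty string at the end of every line, so every line passes through, while only the real pattern gets highlighted. For example (the log path is just an example):

tail -F /var/log/syslog | egrep --color 'error|$'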

echo $X | egrep "^[0-9]+$"
egrep 'Failed password for invalid' /var/log/secure | awk '{print $13}' | uniq
2009-10-04 18:08:13
Functions: awk egrep
Votes: 1

Works for me on CentOS: greps and prints the IP addresses of SSH brute-force attempts.
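
Since uniq only collapses adjacent duplicates, sorting first gives a proper count per address (a sketch based on the same log format):

egrep 'Failed password for invalid' /var/log/secure | awk '{print $13}' | sort | uniq -c | sort -rn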

echo 127.0.0.1 | egrep -e '^(([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])\.){3}([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])$'
2009-09-17 17:40:48
User: arcege
Functions: echo egrep
Votes: -1

Handles everything except octets containing 255. Ran through an IP generator with variable octet lengths.
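
Two quick checks (the second shows the documented 255 limitation):

echo 192.168.0.1 | egrep -e '^(([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])\.){3}([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])$'     # prints the address
echo 255.255.255.255 | egrep -e '^(([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])\.){3}([01]?[0-9]{1,2}|2[0-4][0-9]|25[0-4])$' # prints nothing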

function sepath { echo $PATH |tr ":" "\n" |sort -u |while read L ; do cd "$L" 2>/dev/null && find . \( ! -name . -prune \) \( -type f -o -type l \) 2>/dev/null |sed "s@^\./@@" |egrep -i "${*}" |sed "s@^@$L/@" ; done ; }
2009-09-11 15:03:22
User: mobidyc
Functions: cd echo egrep find read sed sort tr
Tags: bash ksh PATH
Votes: -1

Searches PATH for the given argument.

Accepts grep expressions.

Without arguments, lists all binaries found in PATH.
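
For example, to find every binary in your PATH whose name starts with gcc (the pattern is just an example):

sepath '^gcc'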

function catv { egrep -v "^$|^#" ${*} ; }
2009-09-11 14:58:47
User: mobidyc
Functions: egrep
Votes: 1

Better integration.

Works on all Unices.

Works on bash and ksh.
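
For example, to read a config file without its comments and blank lines (the path is just an example):

catv /etc/ssh/sshd_config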

wget -nv http://en.wikipedia.org/wiki/Linux -O- | egrep -o "http://[^[:space:]]*.jpg" | xargs -P 10 -r -n 1 wget -nv
2009-08-31 18:37:33
User: syssyphus
Functions: egrep wget xargs
Votes: 10

xargs can be used in this manner to download multiple files at a time, and xargs will in this case run 10 processes at a time and initiate a new one when the number running falls below 10.
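
The same pattern works for any list of URLs, one per line (the file name is hypothetical):

xargs -P 10 -r -n 1 wget -nv < urls.txt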

curl -s http://www.commandlinefu.com/commands/browse|egrep '("Fin.*and"|<div class="command">.*</div>)'|sed 's/<[^<]*>//g'|ruby -rubygems -pe 'require "cgi"; $_=sprintf("\n\n%-100s\n\t#%-20s",CGI.unescapeHTML($_).chomp.strip, gets.lstrip) if $.%2'
2009-08-18 19:04:03
User: copremesis
Functions: egrep sed
Votes: 1

Just bored here at work ... if you are daring ... add '| bash' ... enjoy

require 'ruby'