
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using egrep - 183 results
aptitude remove $(dpkg -l|egrep '^ii linux-(im|he)'|awk '{print $2}'|grep -v `uname -r`)
2010-06-10 21:23:00
User: dbbolton
Functions: awk egrep grep
8

This should do the same thing and is about 70 chars shorter.

find . -type f | sed 's,.*,stat "&" | egrep "File|Modify" | tr "\\n" " " ; echo ,' | sh | sed 's,[^/]*/\(.*\). Modify: \(....-..-.. ..:..:..\).*,\2 \1,' | sort
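For comparison, here is a hedged sketch of the same idea using GNU stat's format strings (assumes GNU coreutils; not part of the original submission):

find . -type f -exec stat -c '%y %n' {} + | sort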
d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
2010-05-18 07:38:54
User: lrvick
Functions: c++ egrep sed
9

Requires aria2c, but you could just as easily use wget or anything else.

A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-)
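If aria2c is not available, the innermost loop could be rewritten with wget instead; a sketch (the d, p and c variables come from the surrounding loops above):

for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do wget "${u}";done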

(for i in `find . -maxdepth 2 -name .svn | sed 's/.svn$//'`; do echo $i; svn info $i; done ) | egrep '^.\/|^URL'
2010-05-09 11:54:37
User: jespere
Functions: echo egrep info sed
0

If you have lots of subversion working copies in one directory and want to see in which repositories they are stored, this will do the trick. Can be convenient if you need to move to a new subversion server.

find ./ -name *.h -exec egrep -cH "// | /\*" {} \; | awk -F':' '{print $2 ":" $1}' | sort -gr
2010-04-23 19:00:07
User: blocky
Functions: awk egrep find sort
1

This shows you which files are most in need of commenting (one line of output per file).

printf "\n%25s%10sTOTAL\n" 'FILE TYPE' ' '; for ext in $(find . -iname \*.* | egrep -o '\.[^[:space:].]+$' | egrep -v '\.svn*' | sort -f | uniq -i); do count=$(find . -iname \*$ext | wc -l); printf "%25s%10s%d\n" $ext ' ' $count; done
2010-04-16 21:12:11
User: rkulla
Functions: egrep find printf sort uniq wc
0

I created this command to give me a quick overview of how many file types a directory, and all its subdirectories, contains. It works based on file extension rather than file(1)'s magic output, because that ended up being more accurate and less confusing.

Files that don't have an ext (README) are generally not important for me to want to count, but you're free to customize this to fit your needs.
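For example, a hedged tweak to also count files with no extension at all (README, Makefile and the like), which the *.* pattern above skips:

printf "%25s%10s%d\n" '(no ext)' ' ' $(find . -type f ! -name '*.*' | wc -l)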

cat /etc/apache2/sites-enabled/* | egrep 'ServerAlias|ServerName' | tr -s ' ' | sed 's/^\s//' | cut -d ' ' -f 2 | sed 's/www.//' | sort | uniq
2010-04-08 15:50:34
User: chronosMark
Functions: cat cut egrep sed sort tr
2

Get a list of all the unique hostnames from the apache configuration files. Handy to see what sites are running on a server. A slightly shorter version.

cat /etc/apache2/sites-enabled/* | egrep 'ServerAlias|ServerName' | tr -s " " | sed 's/^[ ]//g' | uniq | cut -d ' ' -f 2 | sed 's/www.//g' | sort | uniq
2010-04-08 08:51:17
User: chronosMark
Functions: cat cut egrep sed sort tr uniq
0

Get a list of all the unique hostnames from the apache configuration files. Handy to see what sites are running on a server.

ls | egrep -v "[REGULAR EXPRESSION]" | xargs rm -v
2010-04-01 02:40:40
User: Saxphile
Functions: egrep ls rm xargs
Tags: files rm
-1

This is a slight variation of an existing submission, but uses a regular expression to look for files instead. This makes it vastly more versatile, and one can easily verify the files to be kept by running ls | egrep "[REGULAR EXPRESSION]".
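As a concrete, purely hypothetical example, to delete everything except .txt and .md files you might first check what will be kept, then remove the rest:

ls | egrep "\.(txt|md)$"
ls | egrep -v "\.(txt|md)$" | xargs rm -v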

du -kd | egrep -v "/.*/" | sort -n
2010-03-30 15:40:35
User: rmbjr60
Functions: du egrep sort
-1

Thanks for the submit! My alternative produces summaries only for directories. The original post additionally lists all files in the current directory. Sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.
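On GNU systems a similar one-level summary can be had with du's --max-depth option (a hedged alternative, not the original submission):

du -h --max-depth=1 | sort -h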

rpm --querytags | egrep -v HEADERIMMUTABLE | sort | while read tag ; do rpm -q --queryformat "$tag: [%{$tag} ]\n" -p $SomeRPMfile ; done
2010-03-25 05:40:48
Functions: egrep read rpm sort
0

If you want to relocate a package on your own, or you just want to know what those PREIN/UN and POSTIN/UN scripts will do, this will dump out all that detail simply.

You may want to expand the egrep to filter out other verbose tags like CHANGELOGTEXT etc., as your needs require.

It isn't clear, but the formatting around $tag is important: %{$tag} just prints out the first line, while [%{$tag }] iterates through multi-line output, joining the lines with a space (yes, there's a space between the g and } characters). To break it out onto separate lines, use [%{$tag\n}], but the output will be long.

This is aside from rpm2cpio | cpio -ivd to extract the package files.
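A hedged illustration of the formatting difference described above, using FILENAMES as the tag and a hypothetical package file some.rpm: the first form prints only the first entry (per the note above), the second joins all entries with spaces, and the third prints one per line.

rpm -qp --queryformat '%{FILENAMES}\n' some.rpm
rpm -qp --queryformat '[%{FILENAMES} ]\n' some.rpm
rpm -qp --queryformat '[%{FILENAMES}\n]' some.rpm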

pattern='regexp_pattern'; find . -type f -perm +220 ! -name '*.bak' -print0 | xargs -0 egrep -lZ $pattern | xargs -0 sed -i.bak -e "/$pattern/d"
lsof -c $processname | egrep 'w.+REG' | awk '{print $9}' | sort | uniq
2010-02-24 16:47:49
User: alustenberg
Functions: awk egrep sort
1

Lists all files that are opened by processes named $processname.

egrep 'w.+REG' filters out non-file listings in lsof, awk gets the filenames, and sort | uniq removes duplication.

sudo aptitude update; sudo apt-get -y --print-uris upgrade | egrep -o -e "http://[^\']+" | sudo aria2c -c -d /var/cache/apt/archives -i -; sudo aptitude -y safe-upgrade
2010-02-18 16:02:29
User: freethinker
Functions: egrep sudo
2

Please install aria2c before you try the above command. On Ubuntu, the command to install aria2c would be:

sudo aptitude install aria2
for i in `ndd /dev/ip \? | awk '{ print $1 }' | egrep -v "ip6|status|icmp|igmp|\?"` ; do echo $i `ndd -get /dev/ip $i` ; done | grep -v \?
2010-02-15 12:32:33
User: felix001
Functions: awk echo egrep grep
0

This command is just for the main IP settings of ndd. If you need ip6 or icmp, edit the exclusion list inside the egrep.

Felix001 - www.Fir3net.com

function svnurl() { svn info $1 | egrep '^URL: (.*)' | sed s/URL\:\ //; }
2010-02-12 15:42:54
User: thebuckst0p
Functions: egrep info sed
0

Can be used in a working copy to output the URL (extracted from svn info), or as part of another function, as $(svnurl some/path). Saves a lot of time in my SVN workflow.
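Usage might look like this (the path is just an example):

svnurl ~/projects/mysite
svn log --limit 5 $(svnurl .)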

sudo netselect -v -s3 $(curl -s http://dns.comcast.net/dns-ip-addresses2.php | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' | sort | uniq)
2010-01-27 00:03:44
User: hackerb9
Functions: egrep sort sudo
2

Comcast is an ISP in the United States that has started hijacking DNS requests as a "service" for its customers. For example, in Firefox, one used to be able to do a quick "I'm Feeling Lucky" Google search by typing a single word into the URL field, assuming the word is not an existing domain when surrounded by www.*.com. Comcast customers never receive the correct NX (non-existent domain) error from DNS. Instead, they are shown a page full of advertising. There is a way to "opt out" from their service, but that requires having the account password and the MAC address of your modem handy. For me, it was easier just to set static DNS servers. But the problem is, which ones to choose? That's what this command answers. It'll show you the three _non-hijacked_ Comcast DNS servers that are the shortest distance away.

Perhaps you don't have Comcast (lucky you!), but hopefully this command can serve as an example of using netselect to find the fastest server from a list. Note that, although this example doesn't show it, netselect will actually perform the uniq and DNS resolution for you.

Requires: netselect, curl, sort, uniq, grep
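As a more general sketch (the hostnames are placeholders), netselect can pick the fastest of any list of servers, for example package mirrors:

sudo netselect -v -s1 mirror1.example.com mirror2.example.com mirror3.example.com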

nmap -T4 -sP 192.168.2.0/24 && egrep "00:00:00:00:00:00" /proc/net/arp
nmap -sP <subnet>.* | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' > results.txt ; for IP in {1..254} ; do echo "<subnet>.${IP}" ; done >> results.txt ; cat results.txt | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 | uniq -u
lynx --dump --source http://www.xkcd.com | grep `lynx --dump http://www.xkcd.com | egrep '(png|jpg)'` | grep title | cut -d = -f2,3 | cut -d '"' -f2,4 | sed -e 's/"/|/g' | awk -F"|" ' { system("display " $1);system("echo "$2); } '
2009-12-03 18:53:57
Functions: awk cut egrep grep
-1

Same thing, just a different way to get there. You will need lynx.

egrep -i "somepattern" `find . -type f -print`
find . -type f | perl -lne 'print if -T;' | xargs egrep "somepattern"
egrep 'https?://([[:alpha:]]([-[:alnum:]]+[[:alnum:]])*\.)+[[:alpha:]]{2,3}(:\d+)?(/([-\w/_\.]*(\?\S+)?)?)?'
2009-11-28 15:41:42
User: putnamhill
Functions: egrep
5

For the record: I didn't build this. Just shared what I found that worked. Apologies to the original author!

I decided I should fix the case where http://example.com is not matched, for the next time I need this. So I read RFC 1035 and formalized the host name regex.

If anyone finds any more holes, please comment.
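For instance, to pull the matching URLs out of a file (access.log is only an example name), add -o and a filename:

egrep -o 'https?://([[:alpha:]]([-[:alnum:]]+[[:alnum:]])*\.)+[[:alpha:]]{2,3}(:\d+)?(/([-\w/_\.]*(\?\S+)?)?)?' access.log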

egrep "<link>|<title>" recenttracks.rss | awk 'ORS=NR%2?" ":"\n"' | awk -F "</title>" '{print $2, $1}' | sed -e 's/\<link\>/\<li\>\<a href\=\"/' -e 's/\<\/link\>/\">/' -e 's/\<title\>//' -e 's/$/\<\/a\>\<\/li\>/g' -e '1,1d' -e 's/^[ \t]*//'
2009-11-28 13:19:05
User: HerbT
Functions: awk egrep sed
3

Quick and kludgy RSS parser for the recent tracks RSS feed from last.fm. Extracts the artist and track link.

lsmod | cut -d' ' -f1 | xargs modinfo | egrep '^file|^desc|^dep' | sed -e'/^dep/s/$/\n/g'
2009-11-17 02:13:34
User: mohan43u
2

Run this as root; it's helpful for quickly getting information about the loaded kernel modules.