
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands tagged grep
Terminal - Commands tagged grep - 351 results
grep -i '^DocumentRoot' /etc/httpd/conf/httpd.conf | cut -f2 -d'"'
vim --version | grep -P '^(\+|\-)' | sed 's/\s/\n/g' | grep -Pv '^ ?$'
2010-07-02 02:57:19
User: evaryont
Functions: grep sed vim
Tags: vim sed grep
2

The output is from a custom-compiled version of Vim on Arch Linux.

Just a quick shell one-liner that presents a list of all the enabled and disabled features (the disabled ones are prefixed with a '-').
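
To probe for a single feature rather than listing them all, a shorter variant along these lines should work (clipboard is just an example feature name):

vim --version | grep -o '[+-]clipboard'

This prints +clipboard if the feature was compiled in and -clipboard if not.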

curl -s "http://www.socrata.com/api/views/vedg-c5sb/rows.json?search=Axelrod" | grep "data\" :" | awk '{ print $17 }'
2010-07-01 23:54:54
User: mheadd
Functions: awk grep
Tags: awk grep curl
2

Query the Socrata Open Data API being used by the White House to find any employee's salary using curl, grep and awk.

Change the value of the search parameter (example uses Axelrod) to the name of any White House staffer to see their annual salary.
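
To make the staffer configurable, the search term can be pulled into a shell variable (a sketch reusing the Axelrod example from above):

NAME="Axelrod"; curl -s "http://www.socrata.com/api/views/vedg-c5sb/rows.json?search=${NAME}" | grep "data\" :" | awk '{ print $17 }'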

function mg(){ man ${1} | egrep ${2} | more; }
2010-07-01 21:14:24
User: quincymd
Functions: egrep man
Tags: man grep
0

A quicker way to search a command's man page for a keyword.
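
For example, to search the bash man page for lines mentioning HISTSIZE (a hypothetical but typical use; note that a pattern containing spaces would need quoting, since ${2} is unquoted in the function):

mg bash HISTSIZE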

ifconfig eth0 | grep -o "inet [^ ]*" | cut -d: -f2
ifconfig eth0 | awk '/inet / {print $2}' | cut -d ':' -f2
wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]} ' | grep -v "^$"
2010-06-29 16:22:03
User: nikunj
Functions: awk grep wget
Tags: awk grep meta
2

A quick variation on the latest-commands list with the blank lines skipped, which is faster to read. Note that the three-argument form of match() used here requires GNU awk (gawk).

for f in $(find /path/to/base -type f | grep -vw CVS); do grep -Hn PATTERN $f; done
ifconfig eth0 | grep "inet " | cut -d ':' -f2 | awk '{print $1}'
2010-06-29 00:06:08
User: jaimerosario
Functions: awk cut grep ifconfig
3

I've been using it in a script that builds proxy servers from scratch.
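
On newer systems where ifconfig is deprecated in favour of iproute2, a similar extraction (a sketch, still assuming the interface is called eth0) would be:

ip addr show eth0 | awk '/inet / {print $2}' | cut -d/ -f1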

zgrep -h "" `ls -tr access.log*`
2010-06-19 09:44:05
User: dooblem
Functions: zgrep
2

I use zgrep because it also handles non-gzipped files.

With ls -tr, we parse the logs in time order.

Grepping the empty string just concatenates all the logs, but you can also grep for an IP, a URL...
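
For example, to pull every hit for one client address out of the whole rotated set (using a documentation-range IP as a placeholder):

zgrep -h "203.0.113.7" `ls -tr access.log*`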

aptitude remove $(dpkg -l|egrep '^ii +linux-(im|he)'|awk '{print $2}'|grep -v `uname -r`)
2010-06-10 21:23:00
User: dbbolton
Functions: awk egrep grep
8

This should do the same thing and is about 70 chars shorter.
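
Before committing, the inner pipeline can be run on its own to preview exactly which kernel packages would be removed:

dpkg -l | egrep '^ii +linux-(im|he)' | awk '{print $2}' | grep -v `uname -r`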

grep -Eo \([0-9]\{1,3\}[\.]\)\{3\}[0-9] file | sort | uniq
dpkg -l | cut -d' ' -f 3 | grep ^python$
grep -P '\t' filename
2010-05-02 02:24:14
Functions: grep
Tags: grep
3

-P tells grep to use Perl-compatible regex matching (which only works with GNU grep, as far as I know).
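
On greps that lack -P, a literal tab can be passed in with printf instead (a portable sketch):

grep "$(printf '\t')" filename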

grep <something> logfile | cut -c2-18 | uniq -c
2010-04-29 11:26:09
User: buzzy
Functions: cut grep uniq
Tags: uniq grep cut
2

The cut should match the relevant timestamp part of the logfile; uniq -c then counts the number of occurrences within each time interval.
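
As a concrete, hypothetical example: with log lines like "[2010-04-29 11:26:09] ERROR ...", cutting characters 2-14 keeps the date-and-hour prefix, so the following counts ERROR lines per hour (uniq only collapses adjacent duplicates, which is fine here since log files are chronological):

grep ERROR app.log | cut -c2-14 | uniq -c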

df -l | grep -e "9.%" -e "100%"
2010-04-26 17:57:54
User: dooblem
Functions: df grep
2

Reports all local partitions having more than 90% usage.

Just add it to a crontab and you'll get a mail when a disk is nearly full.

(Sending mail to the root user must be working for that.)
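
One possible crontab entry for an hourly check (note that % is special inside crontabs and must be escaped as \%; cron mails any output to the job's owner):

0 * * * * df -l | grep -e "9.\%" -e "100\%"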

vim -r 2>&1 | grep '\.sw.' -A 5 | grep 'still running' -B 5
2010-04-17 19:43:35
User: rkulla
Functions: grep vim
3

Catches .swp, .swo, .swn, etc.

If you have access to lsof, it'll give you more compact output and show the associated terminals (e.g., pts/5, on which you could then use 'w' to figure out where the session originates): lsof | grep '\.sw.$'

If you have swap files turned off, you can do something like ps x | grep '[g,v]im', but that won't tell you about files opened in buffers via :e [file].

for i in ~/.mozilla/firefox/*/Cache/*; do file "$i" | grep -i mpeg | awk '{print $1}' | sed 's/.$//'; done
2010-04-11 23:14:18
User: TuxOtaku
Functions: awk file grep sed
4

Ever gone to a site that has an MP3 embedded in a pesky Flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy-to-read list. What you do with them after that is *ahem* no concern of mine. ;)

hb(){ sed "s/\($*\)/`tput setaf 2;tput setab 0;tput blink`\1`tput sgr0`/gI"; }
2010-04-07 08:45:26
User: AskApache
Functions: sed
-2

hb blinks; hc does reversed colors with a background. Both are very nice.

hc(){ sed "s/\($*\)/`tput setaf 0;tput setab 6`\1`tput sgr0`/gI"; }

Run this:

command ps -Hacl -F S -A f | hc ".*$PPID.*" | hb ".*$$.*"

You're welcome ;)

From my bash profile - http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" | while read FUNFACT; do notify-send -t $((1000+300*`echo -n $FUNFACT | wc -w`)) -i gtk-dialog-info "RandomFunFact" "$FUNFACT"; done
2010-04-02 09:43:32
User: mtron
Functions: grep read sed wc wget
2

An extension to tali713's random fact generator. It takes the output and sends it to notify-osd. The display time is proportional to the length of the fact: 1000 ms plus 300 ms per word, so a 10-word fact is shown for 4 seconds.

ffmpeg -f alsa -itsoffset 00:00:02.000 -ac 2 -i hw:0,0 -f x11grab -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -r 10 -i :0.0 -sameq -f mp4 -s wvga -y intro.mp4
2010-03-31 09:33:05
User: mohan43u
Functions: awk grep
4

Yet another x11grab using ffmpeg. I also added mic input to the captured video stream using ALSA. I still need to find out how to capture the audio that is currently playing.

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
2010-03-30 23:49:30
User: tali713
Functions: grep sed wget
13

Since there's no knowing how the site will be designed in the future, this may eventually stop working, but it still serves as a simple, straightforward starting point.

This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page fail, or fail intermittently, one may simply alter the above to read:

wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.

du -kd | egrep -v "/.*/" | sort -n
2010-03-30 15:40:35
User: rmbjr60
Functions: du egrep sort
-1

Thanks for the submission! My alternative produces summaries only for directories. The original post additionally lists all files in the current directory. Sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.
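
On GNU du the depth flag takes a numeric argument, so a roughly equivalent invocation there might be the following, which also makes the egrep filter unnecessary:

du -k --max-depth=1 | sort -n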

alias dush="du -sm *|sort -n|tail"
2010-03-26 10:18:57
User: funky
Functions: alias
28

Sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can run

dush

for the default top-ten listing,

dush -n 3

for only the 3 biggest files, and so on (the extra arguments are appended to the tail at the end of the pipeline). It's always a good idea to have this line in your .profile or .bashrc.