
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands tagged grep - 336 results
df -l | grep -e "9.%" -e "100%"
2010-04-26 17:57:54
User: dooblem
Functions: df grep
2

Reports all local partitions at 90% usage or above.

Just add it to a crontab and you'll get a mail whenever a disk is nearly full (mail delivery to the root user must be working for this).
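
A minimal crontab sketch (assuming cron mails any command output to you; note that '%' must be escaped in crontab entries, and the hourly schedule is only an illustration):

0 * * * * df -l | grep -e "9.\%" -e "100\%"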

vim -r 2>&1 | grep '\.sw.' -A 5 | grep 'still running' -B 5
2010-04-17 19:43:35
User: rkulla
Functions: grep vim
3

Catches .swp, .swo, .swn, etc.

If you have access to lsof, it'll give you more compressed output and show you the associated terminals (e.g., pts/5, which you could then look up with 'w' to figure out where the session is originating from): lsof | grep '\.sw.$'

If you have swap files turned off, you can do something like ps x | grep '[gv]im', but it won't tell you about files opened in buffers via :e [file].

for i in `ls ~/.mozilla/firefox/*/Cache`; do file $i | grep -i mpeg | awk '{print $1}' | sed s/.$//; done
2010-04-11 23:14:18
User: TuxOtaku
Functions: awk file grep sed
4

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)

hb(){ sed "s/\($*\)/`tput setaf 2;tput setab 0;tput blink`\1`tput sgr0`/gI"; }
2010-04-07 08:45:26
User: AskApache
Functions: sed
-2

hb blinks, hc does reverse color with a background; both are very nice.

hc(){ sed "s/\($*\)/`tput setaf 0;tput setab 6`\1`tput sgr0`/gI"; }

Run this:

command ps -Hacl -F S -A f | hc ".*$PPID.*" | hb ".*$$.*"

You're welcome ;)

From my bash profile - http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" | while read FUNFACT; do notify-send -t $((1000+300*`echo -n $FUNFACT | wc -w`)) -i gtk-dialog-info "RandomFunFact" "$FUNFACT"; done
2010-04-02 09:43:32
User: mtron
Functions: grep read sed wc wget
1

An extension to tali713's random fact generator: it takes the output and sends it to notify-osd. Display time is proportional to the length of the fact.

ffmpeg -f alsa -itsoffset 00:00:02.000 -ac 2 -i hw:0,0 -f x11grab -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -r 10 -i :0.0 -sameq -f mp4 -s wvga -y intro.mp4
2010-03-31 09:33:05
User: mohan43u
Functions: awk grep
4

Yet another x11grab using ffmpeg. I also added mic input to the captured video stream using ALSA. I still need to find out how to capture the audio that is currently playing.
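
One possible approach for grabbing the currently playing audio (a sketch, assuming a PulseAudio setup; the monitor source name varies per system, so look it up first):

pactl list short sources | grep monitor

Then substitute that monitor source for the ALSA mic input, e.g. (alsa_output.yourdevice.monitor is a placeholder):

ffmpeg -f pulse -i alsa_output.yourdevice.monitor -f x11grab -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -r 10 -i :0.0 -sameq -f mp4 -y intro.mp4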

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
2010-03-30 23:49:30
User: tali713
Functions: grep sed wget
13

Since the site may be redesigned in the future, this may eventually stop working, but it still serves as a simple, straightforward starting point.

This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page fail, or fail intermittently, one may simply alter the above to read:

wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.

du -kd | egrep -v "/.*/" | sort -n
2010-03-30 15:40:35
User: rmbjr60
Functions: du egrep sort
-1

Thanks for the submission! My alternative produces summaries only for directories, whereas the original post additionally lists all files in the current directory. Sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.

alias dush="du -sm *|sort -n|tail"
2010-03-26 10:18:57
User: funky
Functions: alias
28

Sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can run

dush

for the above output,

dush -n 3

for only the 3 biggest entries, and so on. It's always a good idea to have this line in your .profile or .bashrc.

du -hs *|grep M|sort -n
2010-03-25 19:20:24
User: tuxlifan
Functions: du grep sort
3

This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in)...

A variation without false positives and also finding gigabytes (but - depending on your keyboard setup - more painful to type):

du -hs *|grep -P '^(\d|,)+(M|G)'|sort -n

(NOTE: you might want to replace the ',' according to your locale!)

Don't forget that you can modify the globbing as needed! (e.g. '.[^\.]* *' to include hidden files and directories (w/ bash))

At its core this is similar to:

http://www.commandlinefu.com/commands/view/706/show-sorted-list-of-files-with-sizes-more-than-1mb-in-the-current-dir

perl -ne 'BEGIN{undef $/}; print "$ARGV\t$.\t$1\n" if m/(first line.*\n.*second line)/mg'
2010-03-18 15:46:10
User: hfs
Functions: perl
Tags: perl grep
7

Using perl you can search for patterns spanning several lines, a thing that grep can't do. Append the list of files to the above command or pipe a file through it, just as with regular grep. If you add the 's' modifier to the regex, the dot '.' also matches line endings, which is useful if you don't know how many lines lie between the parts of your pattern. Change '*' to '*?' to make it non-greedy, that is, match as few characters as possible.

See also http://www.commandlinefu.com/commands/view/1764/display-a-block-of-text-with-awk to do a similar thing with awk.

Edit: The undef has to be put in a begin-block, or a match in the first line would not be found.
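
For example, a sketch of the non-greedy, multi-line variant combining both hints above (the pattern and the file name are placeholders):

perl -ne 'BEGIN{undef $/}; print "$ARGV\t$1\n" if m/(first line.*?second line)/sg' yourfile.txt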

grep current_state= /var/log/nagios/status.dat|sort|uniq -c|sed -e "s/[\t ]*\([0-9]*\).*current_state=\([0-9]*\)/\2:\1/"|tr "\n" " "
ip link show
2010-03-01 20:10:27
User: d34dh0r53
Functions: link
Tags: ifconfig grep
6

I prefer the ip command to ifconfig as ifconfig is supposedly going to be deprecated. Certain IP address aliases can only be seen with the ip command (such as the ones applied by RHCS).
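
As a sketch of the same idea with ip, this pulls just the MAC address of a single interface (eth0 is a placeholder for your interface name):

ip link show eth0 | awk '/link\/ether/ {print $2}'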

grep "^V<TAB>" your_file
2010-03-01 08:46:02
User: sathyz
Functions: grep
Tags: grep
9

Mixing tabs and spaces for indentation in Python will confuse the Python interpreter. To avoid that, check whether the file has any tab-based indentation.

"^V" => denotes press control + v and press tab within quotes.

cat improper_indent.py

class Tux(object):

print "Hello world.."

grep " " improper_indent.py

print "Hello world.."

pgrep <name>
2010-02-28 22:59:33
User: alesplin
Tags: grep ps
-2

You'll need to install proctools. MacPorts and Fink have this if you're running Mac OS X, check your Linux distribution's repositories if it isn't installed by default.
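
A quick usage sketch (assuming the common -l and -u options are available; the names are just examples): pgrep -l firefox lists matching PIDs together with their process names, and pgrep -u root sshd restricts the match to processes owned by root.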

ifconfig | awk '/HWaddr/ { print $NF }'
ifconfig -a| grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'
2010-02-27 14:30:43
User: evenme
Functions: grep ifconfig
Tags: ifconfig grep
3

Lists the MAC addresses of all interfaces.

psgrep() { if [ ! -z "$1" ]; then echo "Grepping for processes matching $1..."; ps aux | grep -i "$1" | grep -v grep; else echo "!! Need name to grep for"; fi; }
2010-02-27 13:47:28
User: evenme
Functions: echo grep ps
Tags: grep ps
-4

Grep for a named process.

SITE="www.google.com"; curl --silent "http://www.shadyurl.com/create.php?myUrl=$SITE&shorten=on" | awk -F\' '/is now/{print $6}'
sed -i 's/oldname@example.com/newname@example.com/g' `grep oldname@example.com -rl .`
2010-02-18 18:26:09
User: and3k
Functions: sed
0

Do a recursive (-r) search with grep for all files where your old mail address is mentioned (-l shows only the file names) and use sed to replace it with your new address. Works with other search/replacement patterns too.
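
If any of the matching file names contain spaces, a null-delimited variant is safer (a sketch, assuming GNU grep, xargs and sed):

grep -rlZ 'oldname@example.com' . | xargs -0 sed -i 's/oldname@example.com/newname@example.com/g'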

pdftotext [file] - | grep 'YourPattern'
2010-02-14 21:42:35
User: drewk
Functions: grep
Tags: pipe grep pdf
27

PDF files are simultaneously wonderful and heinous. They are wonderful in being ubiquitous and mostly being cross platform. They are heinous in being very difficult to work with from the command line, search, grep, use only the text inside the PDF, or use outside of proprietary products.

xpdf is a wonderful set of PDF tools. It is in many Linux distros and can be installed on OS X. While primarily an open PDF viewer for X, xpdf ships the tool "pdftotext", which can extract formatted or unformatted text from a PDF that contains text. This text stream can then be further processed by grep or other tools. The '-' after the file name directs output to stdout rather than to a text file with the same name as the PDF.

Make sure you use version 3.02 of pdftotext or later; earlier versions clipped lines.

Without the "-layout" option, the extracted lines are very long, more like paragraphs; use that mode just to test that a pattern exists in the file. With "-layout" the output resembles the original lines, but it is not perfect.

xpdf is available open source at http://www.foolabs.com/xpdf/
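
A sketch for searching a whole directory of PDFs at once (the pattern is a placeholder, as above):

for f in *.pdf; do pdftotext -layout "$f" - | grep -q 'YourPattern' && echo "$f"; done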

find -type f -regex ".*\.\(js\|php\|inc\|htm[l]?\|css\)$" -exec grep -il 'searchstring' '{}' +
find . -type f \( -name "*.js" -o -name "*.php" -o -name "*.inc" -o -name "*.html" -o -name "*.htm" -o -name "*.css" \) -exec grep -il 'searchString' {} \;
2010-02-07 15:28:20
User: niels_bom
Functions: find grep
Tags: find grep search
-1

Use find to recursively make a list of all files from the current directory downwards that have one of the listed extensions. Every file found is then grepped for 'searchString', and its filename is printed if there is a match.
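
With GNU grep, a similar search can be written without find (a sketch; the extension list mirrors the one above):

grep -ril --include='*.js' --include='*.php' --include='*.inc' --include='*.html' --include='*.htm' --include='*.css' 'searchString' .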

find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep
3

Parallel does not suffer from the risk of mixed-up output that xargs suffers from. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.

Parallel is from https://savannah.nongnu.org/projects/parallel/
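
A sketch of the null-delimited variant mentioned above, for file names that contain a newline:

find . -type f -print0 | parallel -0 -j+0 grep -i foobar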