
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using grep - 1,530 results
tr -c "[:digit:]" " " < /dev/urandom | dd cbs=$COLUMNS conv=unblock | GREP_COLOR="1;32" grep --color "[^ ]"
S=`pidof skype`;grep heap /proc/$S/maps|cut -f1 -d' '|awk -F- '{print "0x" $1 " 0x" $2}'|xargs echo "du me t ">l;gdb -batch -p $S -x l>/dev/null 2>&1;strings t|grep \(smirk|head -n1
2009-06-26 20:03:17
User: alvieboy
Functions: awk cut echo grep head xargs
0

Skype has an internal regex which lists the emoticons it supports. However, you cannot simply search the binary file for it. This small 181-character line will do just that, provided Skype is running. And of course, it only works on Linux.

ps aux | grep -v `whoami`
echo "-------------" >> nicinfo.txt; echo "computer name x" >> nicinfo.txt; ifconfig | grep status >> nicinfo.txt; ifconfig | grep inet >> nicinfo.txt; ifconfig | grep ether >> nicinfo.txt; hostinfo | grep type >> nicinfo.txt;
while true; do { $(which logger) -p local4.notice `free -m | grep Mem`; sleep 60; } done &
2009-06-22 00:29:53
User: Neo23x0
Functions: grep sleep which
3

Uses logger in a while loop to log memory statistics frequently into the local syslog server.
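To check that the entries are arriving, something like this should work, assuming your syslog configuration writes local4.notice messages to /var/log/syslog:

grep 'Mem:' /var/log/syslog | tail -n 5   # show the last few logged memory lines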

svn status | grep '!' | sed 's/!/ /' | xargs svn del --force
cut -d: -f1 /etc/passwd | grep -vE "#" | xargs -i{} crontab -u {} -l
2009-06-18 16:49:52
User: hoberion
Functions: crontab cut grep xargs
1

additionally use "find /etc/cron*" for cronscripts

grep -v "^\W$" <filename>
2009-06-18 08:17:22
User: nikc
Functions: grep
Tags: grep non-empty
0

I had some trouble removing empty lines from a file (perhaps due to UTF-8, as it's the source of all evil); \W did the trick eventually.

grep -2 -iIr "err\|warn\|fail\|crit" /var/log/*
2009-06-17 19:41:04
User: miketheman
Functions: grep
6

Using grep, retrieve all lines (plus two lines of context either side, from -2) from the log files in /var/log/ that contain one of the problem states (err, warn, fail, crit).

grep -h -o '<[^/!?][^ >]*' * | sort -u | cut -c2-
2009-06-17 00:22:18
User: thebodzio
Functions: cut grep sort
Tags: sort grep cut
2

This set of commands was very convenient for me when I was preparing some XML files for typesetting a book. I wanted to check which styles I had to prepare but couldn't remember all the tags that I had used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process XML files with XSL, or when writing one's own XML application.
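A minimal sketch of what it prints, using a made-up file name (sample.xml):

echo '<book><title>example</title></book>' > sample.xml
grep -h -o '<[^/!?][^ >]*' sample.xml | sort -u | cut -c2-   # prints "book" and "title", one per line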

grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html
2009-06-16 20:19:47
User: wires
Functions: grep
3

Finds all email addresses in a file, printing each match. Addresses do not have to be alone on a line, etc. For example, you can grab them from HTML-formatted emails or CSV files. Use a combination of

... | sort | uniq

to filter them, as in the sketch below.
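Putting it together, a sketch of the full pipeline on the same file.html as above:

grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html | sort | uniq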

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split
3

Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food, where you want the total calories. Use this to monitor and keep a running total on anything that outputs numbers.
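A quick sketch with a made-up food entry:

echo 'oatmeal 150 cal, banana 105 cal, coffee 5 cal' | perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'   # prints 260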

sed -e "s/\[{/\n/g" -e "s/}, {/\n/g" sessionstore.js | grep url | awk -F"," '{ print $1 }'| sed -e "s/url:\"\([^\"]*\)\"/\1/g" -e "/^about:blank/d" > session_urls.txt
2009-06-14 15:08:31
User: birnam
Functions: awk grep sed
2

This will extract all of the URLs from a Firefox session (including URLs in a tab's history). The sessionstore.js file is in ~/.mozilla/firefox/{firefox profile}.
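To locate the file, something like this usually works, assuming the common profile naming scheme:

ls ~/.mozilla/firefox/*.default/sessionstore.js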

egrep -r '(render_message|multipart).*('`find app/views -name '*.erb' | grep mailer | sed -e 's/\..*//' -e 's/.*\///' | uniq | xargs | sed 's/ /|/g'`')' app/models
ldapsearch -H ldap://localhost:389 -D cn=username,ou=users,dc=domain -x -W -b ou=groups,dc=domain '(member=cn=username,ou=users,dc=domain)' | grep ^dn | sed "s/dn\: cn=\([^,]*\),ou=\([^,]*\),.*/\2 \1/"
2009-06-11 14:50:11
User: nitehawk
Functions: grep sed
-2

This expression looks for groups inside a GroupOfNames class element, which is itself inside one (or many) Organizational Unit (ou) nodes in the LDAP tree. It gives you a quick dump of all the groups the user belongs to. Handy for displaying on a webpage.
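A sketch of the sed step in isolation, on a made-up dn line:

echo "dn: cn=admins,ou=groups,dc=domain" | sed "s/dn\: cn=\([^,]*\),ou=\([^,]*\),.*/\2 \1/"   # prints: groups admins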

if [ -z $(echo $var | grep [0-9]) ]; then echo "NON NUMERIC"; fi
2009-06-04 07:41:26
User: AnusJenkins
Functions: echo grep
2

Use this to execute a block of code only when $var is numeric; as written, the test prints "NON NUMERIC" when $var contains no digits.
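A sketch of the positive branch, running a block only when $var contains a digit (var=42 is just an example value):

var=42; if [ -n "$(echo $var | grep [0-9])" ]; then echo "var is numeric, running block"; fi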

find . -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR"
2009-06-03 22:08:48
User: vincentp
Functions: find grep
11

Finds all corrupted jpeg files in current directory and its subdirectories. Displays the error or warning found.

jpeginfo is part of the jpeginfo package in Debian.

Should you wish to get only the corrupted filenames, use cut to extract them:

find ./ -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR" | cut -d " " -f 1
URL=http://svn.example.org/project; diff -u <(TZ=UTC svn -q log -r1:HEAD $URL | grep \|) <(TZ=UTC svn log -q $URL | grep \| | sort -k3 -t \|)
2009-06-03 14:26:55
User: sunny256
Functions: diff grep sort
Tags: bash svn
2

Lists revisions in a Subversion repository with a timestamp that doesn't follow the revision numbering order. If everything is OK, nothing is displayed.

find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail -f
2009-06-03 09:47:08
User: mohan43u
Functions: cut file find grep sed tail xargs
Tags: tail
5

Works in Ubuntu; I hope it will work on all Linux machines. On other Unixes, tail should be capable of handling more than one file with the '-f' option.

This command line simply takes the log files that are plain text and whose names don't end in a digit, and continuously monitors them.

Putting it behind an alias (or function) in .profile makes it more convenient, as sketched below.
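A sketch of what might go into .profile, written as a shell function rather than an alias to avoid nesting quotes (the name 'taillogs' is made up):

taillogs() { find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e 's/:$//g' | grep -v '[0-9]$' | xargs tail -f; }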

while [ i != 0 ]; do sleep 1 | dialog --clear --gauge "Quality: " 0 0 $(cat /proc/net/wireless | grep $WIRELESSINTERFACE | awk '{print $3}' | tr -d "."); done
2009-05-31 16:09:23
User: ncaio
Functions: awk cat grep sleep tr
1

The variable WIRELESSINTERFACE indicates your wireless interface, as in the example below.
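For example (wlan0 is an assumed interface name; check iwconfig for yours):

WIRELESSINTERFACE=wlan0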

ioreg -lw0 | grep IODisplayEDID | sed "/[^<]*</s///" | xxd -p -r | strings -6
git grep -l "your grep string" | xargs gedit
grep -PL "\t" -r . | grep -v ".svn" | xargs sed -i 's/\t/ /g'
2009-05-28 08:52:14
User: root
Functions: grep sed xargs
3

Note that this assumes the application is an SVN checkout and so we have to throw away all the .svn files before making the substitution.

svn log fileName|cut -d" " -f 1|grep -e "^r[0-9]\{1,\}$"|awk {'sub(/^r/,"",$1);print "svn cat fileName@"$1" > /tmp/fileName.r"$1'}|sh
2009-05-27 02:11:58
User: fizz
Functions: awk cut grep
Tags: bash svn awk grep
2

Exported files will get a .r23 extension (where 23 is the revision number).

lsof -nP +p 24073 | grep -i listen | awk '{print $1,$2,$7,$8,$9}'