
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get at least 3 votes and at least 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using grep - 1,624 results
for i in $(netstat --inet -n|grep ESTA|awk '{print $5}'|cut -d: -f1);do geoiplookup $i;done
2009-10-18 20:41:47
Functions: awk cut grep netstat
3

Sample command to obtain a list of geographic locations for established connections, extracted from netstat. Requires the geoiplookup command (part of the geoip package on CentOS).
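
A minimal variant of the same idea, deduplicating the remote addresses first so each one is looked up only once (a sketch; assumes GNU netstat and geoiplookup are available):

netstat --inet -n | awk '/ESTABLISHED/ {split($5,a,":"); print a[1]}' | sort -u | while read -r ip; do geoiplookup "$ip"; done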

x=1 ; while [ $x -le 10 ] ; do lynx -dump http://www.alexa.com/siteinfo/http://[YOUR WEBSITE] | grep Global | sed 's/ \|Global\|\,//g' >> /var/log/alexa-stats.txt ; sleep 5h ; x=$((x+1)) ; done &
2009-10-17 13:48:05
User: felix001
Functions: grep sed sleep
0

This records the Alexa traffic stats to a file, taking a new sample every 5 hours.

-- www.fir3net.com --
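
If you would rather not leave a shell loop running, roughly the same sampling can be done from cron; a sketch of a crontab entry (assumes lynx is installed and /var/log/alexa-stats.txt is writable by the cron user):

0 */5 * * * lynx -dump "http://www.alexa.com/siteinfo/http://[YOUR WEBSITE]" | grep Global | sed 's/ \|Global\|\,//g' >> /var/log/alexa-stats.txt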

ord () { seq 1 127 | while read i; do echo `chr $i` $i; done | grep "^$1 " | cut -c '3-'; }
2009-10-16 21:54:01
User: infinull
Functions: cut echo grep read seq
0

Uses the previous "chr" function to build the inverse function "ord" by brute force.

It's slow and inelegant, but it works.

I thought I needed ord/chr to do a Caesar cipher in a shell script a while ago, but eventually I realized I could get fancy with tr and do the same thing...
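
For comparison, a sketch of the usual printf trick, which avoids the brute-force loop entirely (assumes a POSIX-style printf builtin):

ord() { printf '%d\n' "'$1"; }
chr() { printf "\\$(printf '%03o' "$1")\n"; }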

perl -e '$i=0;while($i<10){open(WGET,qq/|xargs lynx -dump/);printf WGET qq{http://www.google.com/search?q=site:g33kinfo.com&hl=en&start=$i&sa=N},$i+=10}'|grep '\/\/g33kinfo.com\/'
2009-10-16 12:20:17
User: op4
Functions: grep perl xargs
Tags: web browser
0

not my cmd... found on the web

tail -f FILE | grep --color=always KEYWORD
h() { if [ -z "$1" ]; then history; else history | grep "$@"; fi; }
2009-10-13 21:49:37
User: haivu
Functions: grep
Tags: bash grep
6

Place this in your .bash_profile and you can use it two different ways. If you issue 'h' on its own, then it acts like the history command. If you issue:

h cd

then it will display all history entries containing the word 'cd'.

b="http://2010.utosc.com"; for p in $( curl -s $b/presentation/schedule/ | grep /presentation/[0-9]*/ | cut -d"\"" -f2 ); do f=$(curl -s $b$p | grep "/static/slides/" | cut -d"\"" -f4); if [ -n "$f" ]; then echo $b$f; curl -O $b$f; fi done
2009-10-11 17:28:46
User: danlangford
Functions: cut echo grep
Tags: curl cut for UTOSC
2

Missed a class at UTOSC 2010? Need a refresher? Use this to curl down all the presentations from the UTOSC website (http://2010.utosc.com). NOTE/WARNING: this will dump them in the current directory; there are around 37 and some are big. Tested on OS X 10.6.1.
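
To keep the ~37 downloads out of your working directory, run the command from a scratch directory first (hypothetical directory name):

mkdir -p utosc-slides && cd utosc-slides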

ps -ef | grep pmon
netstat -an|grep -ci "tcp.*established"
2009-10-09 01:08:18
User: romulusnr
Functions: grep netstat
3

If you want to prepend/append text, just wrap it in echo:

echo Connected: `netstat -an|grep -ci "tcp.*established"`
grep $'\t' file.txt
ls -F|grep /
2009-10-08 16:35:15
User: romulusnr
Functions: grep ls
-1

No need for -l, and the output can be sent directly into another command expecting directory names.
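
For example, a sketch of piping the result into another command (du here, purely for illustration; breaks on names containing newlines):

ls -F | grep / | xargs -I{} du -sh {}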

find . -iname ".project"| xargs -I {} dirname {} | LC_ALL=C xargs -I {} svn info {} | grep "Last Changed Rev\|Path" | sed "s/Last Changed Rev: /;/" | sed "s/Path: //" | sed '$!N;s/\n//'
2009-10-07 16:13:27
User: hurz
Functions: dirname find grep info sed xargs
0

Searches for all .project files in the current folder and below, and uses "svn info" to get the last changed revision. The last sed joins every two lines.

cvs -n update 2>/dev/null | grep -i "M " | sed s/"M "//
netstat -anp --tcp --udp | grep LISTEN
sqlite3 mydb.sqlite3 '.dump' | grep -vE '^(BEGIN|COMMIT|CREATE|DELETE)|"sqlite_sequence"' | sed -r 's/"([^"]+)"/`\1`/' | tee mydb.sql | mysql -p mydb
2009-10-02 14:40:51
User: mislav
Functions: grep sed tee
Tags: mysql sqlite dump
0

Filters out all non-insert SQL operations (we can't simply keep only lines starting with "INSERT", because inserts can span multiple lines), quotes table names with backticks, saves the dump to a file and pipes it straight to mysql.

This transfers only data; it expects your schema to already be in place. In Ruby on Rails, you can easily recreate the schema in MySQL with "rake db:schema:load RAILS_ENV=production".
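
Because the dump is tee'd to mydb.sql, you can replay the import later without redoing the conversion; a usage sketch:

mysql -p mydb < mydb.sql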

md5sum --check MD5SUMS | grep -v ": OK"
2009-10-02 05:21:17
User: gpenguin
Functions: grep md5sum
6

Lines for valid files are suppressed, so only failures show up. No output means all checks passed.
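
With GNU coreutils you can get a similar effect from md5sum itself; a sketch (--quiet suppresses the OK lines so only failures are printed):

md5sum --quiet --check MD5SUMS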

nice -n19 sh -c 'S=askapache R=htaccess; find . -mount -type f|xargs -P5 -iFF grep -l -m1 "$S" FF|xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'
9

I needed a way to search all files in a web directory that contained a certain string, and replace that string with another string. In the example, I am searching for "askapache" and replacing that string with "htaccess". I wanted this to happen as a cron job, and it was important that this happened as fast as possible while at the same time not hogging the CPU since the machine is a server.

So this script uses the nice command to run the sh shell with the command, which makes the whole thing run at priority 19, meaning it won't hog the CPU. The -P5 option to xargs means it will run 5 separate grep and sed processes simultaneously, so this is much faster than running a single grep or sed. You may want to use -P0, which is unlimited, if you aren't worried about spawning too many processes or don't have to deal with process killers in the background.

Also, the -m1 option to grep means it stops searching each file after the first match, which also saves time.
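
A roughly equivalent sketch using null-delimited file names, which also copes with spaces in paths (assumes GNU grep, xargs and sed; adjust S and R as above):

nice -n19 sh -c 'S=askapache R=htaccess; grep -rlZ "$S" . | xargs -0 -P5 -n1 sed -i "s%${S}%${R}%g"'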

expdp user/password FLASHBACK_SCN=$(echo -e "select current_scn from v\$database;" | sqlplus / as sysdba 2>/dev/null| grep [0-9][0-9][0-9][0-9][0-9][0-9]*)
2009-10-01 08:55:20
User: peshay
Functions: echo grep
0

Creates a consistent Data Pump export of an Oracle database at the current SCN (system change number), while the system is running and changes are happening on the database.

check_dns_no() { for i in $* ; do if `wget -O - -q http://www.norid.no/domenenavnbaser/whois/?query=$i.no | grep "no match" &>/dev/null` ; then echo $i.no "available" ; fi ; sleep 1 ;done }
2009-09-30 21:17:33
User: xeor
Functions: echo grep sleep
Tags: wget dig dns
0

Mostly for Norwegians, but easily adaptable to others. Very handy if you are brainstorming for a new domain name.

Will only display the available ones.

You can usually do this better with dig, but if you don't have dig, or the TLD only has an online service to check with, this will be useful.
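
A rough dig-based sketch of the same check (assumes dig is installed; a missing NS record is only a hint, not proof, that a .no domain is unregistered):

check_dns_no() { for d in "$@"; do dig +short NS "$d.no" | grep -q . || echo "$d.no available"; sleep 1; done; }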

strings /boot/kernel-file | grep 2.6
2009-09-30 06:21:40
Functions: grep strings
-10

Someone in #linux recently shared this: it finds the kernel version string from the kernel binary without using uname.
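
On compressed images (bzImage) strings may not find a clean version string; a hedged alternative is to let file read the kernel header, which usually reports the version (assuming your kernels live under /boot/vmlinuz-*):

file /boot/vmlinuz-*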

while true; do curl -s http://www.commandlinefu.com/commands/view/3643/log-a-commands-votes | grep 'id="num-votes-' | sed 's;.*id="num-votes-[0-9]*">\([0-9\-]*\)</div>;\1;' >> votes; sleep 10; done
2009-09-26 00:55:24
User: matthewbauer
Functions: grep sleep
0

Log a command's votes,

then run:

gnuplot -persist <(echo "plot 'votes' with lines")
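
If you also want a time axis, a sketch that logs a Unix timestamp with each sample and plots column 2 against column 1 (same URL as above):

while true; do echo "$(date +%s) $(curl -s http://www.commandlinefu.com/commands/view/3643/log-a-commands-votes | grep 'id="num-votes-' | sed 's;.*id="num-votes-[0-9]*">\([0-9\-]*\)</div>;\1;')" >> votes; sleep 10; done

gnuplot -persist <(echo "set xdata time; set timefmt '%s'; plot 'votes' using 1:2 with lines")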
grep . filename > newfilename
wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
grep -v "^$" filename > newfilename
2009-09-24 12:21:43
User: eastwind
Functions: grep
1

The ^$ within the quotes is a regular expression: ^=beginning of line, $=end of line, with no characters between.
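
To also drop lines that contain only whitespace (spaces or tabs), widen the pattern slightly:

grep -v '^[[:space:]]*$' filename > newfilename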

sudo lshw -C cpu|grep width