What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and of 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using awk - 1,097 results
du -sb *|sort -nr|head|awk '{print $2}'|xargs du -sh
ip route show dev ppp0 | awk '{ print $7 }'
awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -F\ '{ if($1>500) print $1,$2;}' | sort -n
2009-05-05 22:21:04
User: jaymzcd
Functions: awk grep sort uniq
1

This prints a summary of the referrers from your logs, as long as they occurred a certain number of times (in this case 500). The grep command excludes certain terms; I add this in to remove results I'm not interested in.
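
If you want the threshold to be easy to change, one option (a sketch, not from the original author) is to pass it in with awk's -v flag:

awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -v min=500 '$1 > min {print $1, $2}' | sort -n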

Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
2009-05-05 21:51:16
User: jaymzcd
Functions: awk egrep wc
0

I use this (well, I normally just drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to use CTRL+A to jump to the start and then CTRL+F to move forward the three steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy; it's normally easier to have it all at the front so you just have to edit it and hit enter.)

For people new to the shell, it does the following: the Q= and F= bits just create names we can refer to. awk -F\" '{print $4}' $F reads the file specified by $F and splits each line up using double quotes, printing the fourth field for egrep to work on; the fourth field in the log is the referrer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
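
As a quick illustration of the splitting (a sketch assuming the common combined log format, where the referrer is the second quoted field), here is a sample line run through the first awk stage:

echo '1.2.3.4 - - [05/May/2009:21:51:16 +0000] "GET / HTTP/1.1" 200 512 "http://reddit.com/r/foo" "Mozilla/5.0"' | awk -F\" '{print $4}'

which prints http://reddit.com/r/foo.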

ps -eo user,pcpu,pmem | tail -n +2 | awk '{num[$1]++; cpu[$1] += $2; mem[$1] += $3} END{printf("NPROC\tUSER\tCPU\tMEM\n"); for (user in cpu) printf("%d\t%s\t%.2f%\t%.2f%\n",num[user], user, cpu[user], mem[user]) }'
wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg, generates a single column of words, and then counts the occurrences of each word with the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.
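
As one possible shorter variant (a sketch, not the author's; it skips the sed step, so the Gutenberg header lines are counted too, and it prints counts before words):

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | tr -cs 'A-Za-z' '\n' | tr 'A-Z' 'a-z' | awk 'length > 1' | sort | uniq -c | sort -rn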

p=$(netstat -nate 2>/dev/null | awk '/LISTEN/ {gsub (/.*:/, "", $4); if ($4 == "4444") {print $8}}'); for i in $(ls /proc/|grep "^[1-9]"); do [[ $(ls -l /proc/$i/fd/|grep socket|sed -e 's|.*\[\(.*\)\]|\1|'|grep $p) ]] && cat /proc/$i/cmdline && echo; done
2009-04-30 12:39:48
User: j0rn
Functions: awk cat grep ls netstat sed
-5

OK, so it's a really useless line and I'm sorry for that; furthermore, it's not optimized at all...

At first I didn't manage to get netstat -p to print out which process was handling the open port 4444; I realized in the end that I was not root, so security restrictions applied ;p

It's nevertheless a (good?) way to see how ps(tree) works, as it acts exactly the same way, by reading /proc.

So, for a specific port, this line returns the command line of every process that handles the associated socket.
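
For comparison, when you do have root, a tool like lsof gets there in one step (assuming lsof is installed; port 4444 as in the example above):

sudo lsof -i :4444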

w | egrep -v '(load|FROM)' | awk '{print $2}' | sed 's/^/tty/' | awk '{print "echo \"The Matrix has you...\" >> /dev/" $1}' | bash
2009-04-29 22:04:56
User: copremesis
Functions: awk egrep sed
-12

This works just like write or wall, except for one thing: the sender is anonymous. If you really want to drive everyone insane, replace echo \"The Matrix has you...\" with cat /dev/urandom.

A nice one to do on April Fools' Day.
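
For a single terminal, the urandom variant mentioned above would look like this (a sketch; /dev/pts/3 is a hypothetical target tty, and the cat runs until you stop it with Ctrl+C):

cat /dev/urandom > /dev/pts/3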

ps auxf | grep httpd | grep -v grep | grep -v defunct | awk '{sum=sum+$6}; END {print sum/1024}'
dpkg-query -l| grep -v "ii " | grep "rc " | awk '{print $2" "}' | tr -d "\n" | xargs aptitude purge -y
2009-04-28 19:25:53
User: thepicard
Functions: awk grep tr xargs
-3

For an application that has already been removed but had its configuration left behind, this will purge that configuration from the system. To test it out first, you can remove the final -y, and aptitude will show you what it would purge without actually doing it. It never hurts to check first, "just in case." ;)
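
A quick way to preview the affected packages (a sketch; "rc" is dpkg's status for removed-but-configuration-remains):

dpkg-query -l | awk '/^rc/ {print $2}'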

sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`
sudo apt-get remove --purge `dpkg -l | awk '{print $2}' | grep gnome` && apt-get autoremove
2009-04-28 10:34:42
User: kelevra
Functions: awk grep sudo
Tags: awk apt-get dpkg
-4

Useful for removing a package and its dependencies; for example, to remove the GNOME desktop environment. Configuration files will also be removed, so you should be careful and sure that you want to do this.
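
If you would rather see what would happen first, apt-get's -s (simulate) flag does a dry run of the same command (a sketch):

sudo apt-get -s remove --purge `dpkg -l | awk '{print $2}' | grep gnome`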

ifconfig en1 | awk '/inet / {print $2}' | mail -s "hello world" email@email.com
2009-04-28 06:01:52
User: rez0r
Functions: awk ifconfig mail
9

This is useful if you need to do port forwarding and your router doesn't assign static IPs; you can add it to a script run from a cron job that checks whether your IP has recently changed, or to a trigger script.

This was tested on Mac OS X.
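
A minimal change-detection sketch along those lines (~/.lastip is a hypothetical state file; the interface and address are as in the example above):

ip=$(ifconfig en1 | awk '/inet / {print $2}'); [ "$ip" != "$(cat ~/.lastip 2>/dev/null)" ] && { echo "$ip" > ~/.lastip; echo "$ip" | mail -s "IP changed" email@email.com; }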

netstat -an | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | awk '{ printf("%s\t%s\t",$2,$1) ; for (i = 0; i < $1; i++) {printf("*")}; print "" }'
2009-04-27 22:02:19
User: knassery
Functions: awk grep netstat sort uniq
45

Written for Linux. The real point of the example is how to produce ASCII text graphs based on a numeric value (anything where uniq -c is useful is a good candidate).

awk 'BEGIN {srand()} {print int(rand()*1000000) "\t" $0}' FILE | sort -n | cut -f 2-
2009-04-19 20:04:58
User: udim
Functions: awk cut sort
-3

Replace FILE with a filename (or - for stdin).
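
For example, to shuffle ten lines read from stdin (the trailing - is the stdin placeholder):

seq 1 10 | awk 'BEGIN {srand()} {print int(rand()*1000000) "\t" $0}' - | sort -n | cut -f 2-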

find /path/to/my/files/ -type f -name "*txt*" | xargs du -k | awk 'BEGIN{x=0}{x=x+$1}END{print x}'
2009-04-16 14:17:04
Functions: awk du find xargs
2

Use the find command to match certain files and summarise their total size in KBytes.
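
If the matched filenames may contain spaces, a safer variant (a sketch) uses null-terminated names:

find /path/to/my/files/ -type f -name "*txt*" -print0 | xargs -0 du -k | awk '{x+=$1} END {print x}'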

awk '/sshd/ && /Failed/ {gsub(/invalid user/,""); printf "%-12s %-16s %s-%s-%s\n", $9, $11, $1, $2, $3}' /var/log/auth.log
2009-04-16 00:56:23
User: frailotis
Functions: awk printf
11

A variation of a script I found on this site, slimmed down to just use awk. It displays all users who have attempted to log in to the box via SSH and failed. Pipe it through sort to see which usernames have the most failed logins, as sketched below.
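
The counting version might look like this (a sketch; $9 is the username field, as in the original):

awk '/sshd/ && /Failed/ {gsub(/invalid user/,""); print $9}' /var/log/auth.log | sort | uniq -c | sort -rn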

/sbin/ifconfig -a | awk '/(cast)/ { print $2 }' | cut -d':' -f2 | head -1
find | sed -e "s/^.*\///" | awk ' BEGIN { FS=""} { print NF " " $0 } ' | sort -nrf | head -10
cd ~/.mozilla/firefox/ && sqlite3 `cat profiles.ini | grep Path | awk -F= '{print $2}'`/formhistory.sqlite "select * from moz_formhistory" && cd - > /dev/null
2009-04-13 20:23:37
User: klipz
Functions: awk cd grep
19

When you fill in a form with Firefox, you see things you entered in previous forms with the same field names. This command lists everything Firefox has recorded. Using a "delete from", you can remove annoying Google queries, for example ;-)
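
For instance, to drop recorded Google searches (a sketch; it assumes the moz_formhistory table's fieldname column, where Google's search box is named q):

cd ~/.mozilla/firefox/ && sqlite3 `cat profiles.ini | grep Path | awk -F= '{print $2}'`/formhistory.sqlite "delete from moz_formhistory where fieldname='q'"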

svn status | grep "^\?" | awk '{print $2}' | xargs svn add
svn status | grep '^?' | awk '{ print $2; }' | xargs svn add
2009-04-10 21:55:37
Functions: awk grep xargs
Tags: svn awk xargs
1

Lists the local files that are not present in the remote repository (lines beginning with ?) and adds them.
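
If the unversioned filenames may contain spaces, a whitespace-safe sketch is:

svn status | grep '^?' | sed 's/^? *//' | while read -r f; do svn add "$f"; done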

grep "install " /var/log/dpkg.log | awk '{print $4}' | xargs apt-get -y remove --purge
history|awk '{print $2}'|awk 'BEGIN {FS="|"} {print $1}'|sort|uniq -c|sort -r
2009-04-05 13:40:56
User: kayowas
Functions: awk sort uniq
7

Returns a ranked list of your most commonly entered commands, using your command history.
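
Note that the final sort -r compares the counts as text; a numeric sort (a small tweak, not the original) ranks them properly:

history | awk '{print $2}' | awk 'BEGIN {FS="|"} {print $1}' | sort | uniq -c | sort -rn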

find . -type f -size +25000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'