
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
I have been away from the interwebs over Christmas; I will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands tagged awk, sorted by date
Terminal - Commands tagged awk - 276 results
svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
2009-06-05 14:07:28
User: jemptymethod
Functions: awk sort
Tags: svn awk log
3

Just change the date following the -r flag and/or the username in the user== conditional, and substitute yms_web with the name of your module.
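If you want to sanity-check the awk filter without a Subversion checkout handy, you can feed it fabricated svn log -v output (the revisions, usernames and paths below are made up):

```shell
# Simulated 'svn log -v' output; the awk filter keeps paths touched by george.
printf '%s\n' \
  'r100 | george | 2009-05-22 10:00:00 +0000 (Fri, 22 May 2009) | 1 line' \
  'Changed paths:' \
  '   M /yms_web/app.py' \
  'r101 | alice | 2009-05-23 11:00:00 +0000 (Sat, 23 May 2009) | 1 line' \
  'Changed paths:' \
  '   M /yms_web/other.py' \
| awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' \
| sort | uniq
```

This prints /yms_web/app.py only, since only r100 was committed by george.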

awk '/match/{print NR}' file

Prints the line number of every line matching the pattern "match".
cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
2009-06-02 15:51:51
User: lrvick
Functions: awk cat
0

Ever compress a file for the web by replacing all newline characters with nothing, so it makes one nice big blob?

It's a great idea; however, what about when you want to edit that file? A serious pain in the butt.

I ran into this today: my only copy of a CSS file had been "compressed" with all its newlines removed.

I whipped this up, and it converted the blob back into nice, human-readable CSS :-)

It could be nicer, but it does the job.
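As a quick illustration, here is a minimal one-line stylesheet being expanded (the CSS is made up):

```shell
# The gsub appends a newline after every '{', '}' and ';',
# yielding one declaration per line.
echo 'body{color:red;margin:0}a{color:blue}' \
| awk '{gsub(/{|}|;/,"&\n"); print}'
```

The blob comes back as five readable lines, from body{ down to color:blue}.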

awk 'BEGIN{srand()}{print rand(),$0}' SOMEFILE | sort -n | cut -d ' ' -f2-
2009-05-29 01:20:50
User: axelabs
Functions: awk cut sort
Tags: sort awk random
4

This prepends a random number as the first field of every line in SOMEFILE, sorts on that first column, and finally cuts the random numbers off again.
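For example, shuffling the numbers 1 to 5: the order differs on every run, but the output is always a permutation of the input.

```shell
# Prepend a random sort key, sort on it, then strip it off again.
seq 5 | awk 'BEGIN{srand()}{print rand(),$0}' | sort -n | cut -d ' ' -f2-
```

Re-sorting the output numerically always gives back 1 through 5.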

awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}' file.dat
2009-05-29 00:07:24
User: mungewell
Functions: awk
2

Sometimes jittery data hides trends, performing a rolling average can give a clearer view.
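As a small demonstration with seq: the first five outputs are cumulative averages while the window fills, after which the 5-value window slides.

```shell
# Rolling mean over a 5-value window of the numbers 1..7.
seq 7 \
| awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}'
```

This prints 1, 1.5, 2, 2.5, 3, 4, 5, one value per line.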

svn log fileName|cut -d" " -f 1|grep -e "^r[0-9]\{1,\}$"|awk {'sub(/^r/,"",$1);print "svn cat fileName@"$1" > /tmp/fileName.r"$1'}|sh
2009-05-27 02:11:58
User: fizz
Functions: awk cut grep
Tags: bash svn awk grep
2

exported files will get a .r23 extension (where 23 is the revision number)

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.

vmstat 1 10 | /usr/xpg4/bin/awk -f ph-vmstat.awk
2009-05-04 04:55:00
User: MarcoN
Functions: vmstat
5

% cat ph-vmstat.awk

# Return human readable numbers
function hrnum(a) {
    b = a ;
    if (a > 1000000) { b = sprintf("%2.2fM", a/1000000) ; }
    else if (a > 1000) { b = sprintf("%2.2fK", a/1000) ; }
    return(b) ;
}

# Return human readable storage
function hrstorage(a) {
    b = a ;
    if (a > 1024000) { b = sprintf("%2.2fG", a/1024/1024) ; }
    else if (a > 1024) { b = sprintf("%2.2fM", a/1024) ; }
    return(b) ;
}

OFS=" " ;

$1 !~ /[0-9].*/ {print}

$1 ~ /[0-9].*/ {
    $4 = hrstorage($4) ;
    $5 = hrstorage($5) ;
    $9 = hrnum($9) ;
    $10 = hrnum($10) ;
    $17 = hrnum($17) ;
    $18 = hrnum($18) ;
    print ;
}

p=$(netstat -nate 2>/dev/null | awk '/LISTEN/ {gsub (/.*:/, "", $4); if ($4 == "4444") {print $8}}'); for i in $(ls /proc/|grep "^[1-9]"); do [[ $(ls -l /proc/$i/fd/|grep socket|sed -e 's|.*\[\(.*\)\]|\1|'|grep $p) ]] && cat /proc/$i/cmdline && echo; done
2009-04-30 12:39:48
User: j0rn
Functions: awk cat grep ls netstat sed
-5

OK, so it's a really useless line and I'm sorry for that; furthermore, it's not optimized at all.

At first I didn't manage, using netstat -p, to print out which process was handling the open port 4444; I eventually realized I was not root and security restrictions applied ;p

It's nevertheless a (good?) way to see how ps(tree) works, as it acts in exactly the same way by reading /proc.

So, for a specific port, this line returns the calling command line of every process that handles the associated socket.

dpkg-query -l| grep -v "ii " | grep "rc " | awk '{print $2" "}' | tr -d "\n" | xargs aptitude purge -y
2009-04-28 19:25:53
User: thepicard
Functions: awk grep tr xargs
-3

For applications that have already been removed but left their configuration behind, this purges that configuration from the system. To test it out first, remove the trailing -y and it will show you what it would purge without actually doing it. It never hurts to check first, "just in case." ;)

sudo apt-get remove --purge `dpkg -l | awk '{print $2}' | grep gnome` && apt-get autoremove
2009-04-28 10:34:42
User: kelevra
Functions: awk grep sudo
Tags: awk apt-get dpkg
-4

Useful for removing a package and everything that depends on it, for example the GNOME desktop environment. Configuration files are removed as well, so be careful and be sure that this is really what you want to do.

ifconfig en1 | awk '/inet / {print $2}' | mail -s "hello world" email@email.com
2009-04-28 06:01:52
User: rez0r
Functions: awk ifconfig mail
9

This is useful if you need to do port forwarding and your router doesn't assign static IPs: add it to a script run from a cron job that checks whether your IP has recently changed, or run it from a trigger script.

This was tested on Mac OSX.

svn status | grep '^?' | awk '{ print $2; }' | xargs svn add
2009-04-10 21:55:37
Functions: awk grep xargs
Tags: svn awk xargs
1

Lists the local files that are not present in the remote repository (lines beginning with ?) and adds them.

awk '{print > $3".txt"}' FILENAME
2009-03-31 15:14:13
User: alperyilmaz
Functions: awk
2

This command splits the contents of FILENAME into separate .txt files, using the 3rd column of each line as the file name. If FILENAME's contents are as follows:

foo foo A foo

bar bar B bar

lorem ipsum A lorem

Then two files called A.txt and B.txt will be created and their contents will be:

A.txt

foo foo A foo

lorem ipsum A lorem

and B.txt will be

bar bar B bar

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
2009-03-28 21:02:26
User: tiagofischer
Functions: awk cut netstat sort uniq
14

Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server, sorted by count.

BSD Version:

netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr
awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)**2)}' file.dat
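The one-liner above computes the population standard deviation of column one, as sqrt(mean of the squares minus the square of the mean). Note that ** is a gawk extension; the portable POSIX spelling of the power operator is ^, used here:

```shell
# The values 2 4 4 4 5 5 7 9 have mean 5 and variance 4, so this prints 2.
printf '%s\n' 2 4 4 4 5 5 7 9 \
| awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)^2)}'
```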
seq 0 0.1 20 | awk '{print $1, cos(0.5*$1)*sin(5*$1)}' | graph -T X
2009-03-24 21:46:59
User: kaan
Functions: awk seq
2

The arguments of "seq" indicate the starting value, step size, and the end value of the x-range. "awk" outputs (x, f(x)) pairs and pipes them to "graph", which is part of the "plotutils" package.

seq 6 | awk '{for(x=1; x<=5; x++) {printf ("%f ", rand())}; printf ("\n")}'
2009-03-24 21:33:38
User: kaan
Functions: awk printf seq
Tags: awk seq
3

Displays six rows and five columns of random numbers between 0 and 1. If you need only one column, you can dispense with the "for" loop.

awk '{sum1+=$1; sum2+=$2} END {print sum1/NR, sum2/NR}' file.dat
2009-03-24 21:22:14
User: kaan
Functions: awk
Tags: awk
2

This example calculates the averages of column one and column two of "file.dat". It can be easily modified if other columns are to be averaged.
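For example, with a small made-up data file:

```shell
# Averages of two columns: (1+2+3)/3 and (10+20+30)/3.
printf '1 10\n2 20\n3 30\n' \
| awk '{sum1+=$1; sum2+=$2} END {print sum1/NR, sum2/NR}'
```

This prints 2 20, the averages of the two columns.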

seq 50| awk 'BEGIN {a=1; b=1} {print a; c=a+b; a=b; b=c}'
2009-03-24 20:39:24
User: kaan
Functions: awk seq
Tags: awk seq
13

Another combination of seq and awk, printing the first 50 Fibonacci numbers. Not very efficient, but sufficiently quick.
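For example, the first ten terms:

```shell
# Each iteration prints a, then shifts the pair (a, b) forward one step.
seq 10 | awk 'BEGIN {a=1; b=1} {print a; c=a+b; a=b; b=c}'
```

This prints 1 1 2 3 5 8 13 21 34 55, one number per line.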

seq 100 | awk '{sum+=$1} END {print sum}'
2009-03-24 20:30:40
User: kaan
Functions: awk seq
Tags: awk seq
4

"seq 100" outputs 1,2,..,100, separated by newlines. awk adds them up and displays the sum.

"seq 1 2 11" outputs 1,3,..,11.

Variations:

1+3+...+(2n-1) = n^2

seq 1 2 19 | awk '{sum+=$1} END {print sum}' # displays 100

1/2 + 1/4 + ... = 1

seq 10 | awk '{sum+=1/(2**$1)} END {print sum}' # displays 0.999023
cat file.txt | sort | uniq -dc
2009-03-21 18:15:14
User: Vadi
Functions: cat sort uniq
1

Displays the duplicated lines in a file along with how often each one occurs.
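For example, with some fabricated input:

```shell
# 'apple' appears three times and 'banana' twice; 'cherry' is
# unique, so 'uniq -d' omits it. Counts are printed right-aligned.
printf 'apple\nbanana\napple\napple\nbanana\ncherry\n' | sort | uniq -dc
```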

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
2009-03-21 06:41:59
Functions: awk printf sudo zcat
21

Shows the number of failed login attempts per account. If the user does not exist, it is marked with *.
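You can check the parsing against fabricated auth.log lines (the host, users and addresses below are invented):

```shell
# Two failures for the existing user 'root', one for a
# nonexistent 'admin'; output is sorted by count.
printf '%s\n' \
  'Mar 21 06:00:01 host sshd[1]: Failed password for root from 1.2.3.4 port 22 ssh2' \
  'Mar 21 06:00:02 host sshd[2]: Failed password for root from 1.2.3.4 port 22 ssh2' \
  'Mar 21 06:00:03 host sshd[3]: Failed password for invalid user admin from 1.2.3.4 port 22 ssh2' \
| awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
```

This prints 1 *admin followed by 2 root, with the counts right-aligned.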

cat count.txt | awk '{ sum+=$1} END {print sum}'
2009-03-16 00:22:13
User: duxklr
Functions: awk cat
Tags: awk
15

Takes an input file (count.txt) that looks like:

1

2

3

4

5

It will add/sum the first column of numbers.

zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
2009-03-03 13:45:56
User: dbart
Functions: awk sort uniq zgrep
8

This command counts how many times each user has tried and failed to log in to your server. If a user appears many times, that account is being targeted: make sure it either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all login attempts from users that don't exist on your system.