
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using awk - 1,111 results
arp -s $(route -n | awk '/^0.0.0.0/ {print $2}') $(arp -n | grep `route -n | awk '/^0.0.0.0/ {print $2}'`| awk '{print $3}')
awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)**2)}' file.dat
seq 0 0.1 20 | awk '{print $1, cos(0.5*$1)*sin(5*$1)}' | graph -T X
2009-03-24 21:46:59
User: kaan
Functions: awk seq
Votes: 2

The arguments of "seq" indicate the starting value, step size, and the end value of the x-range. "awk" outputs (x, f(x)) pairs and pipes them to "graph", which is part of the "plotutils" package.
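If no X display is available, "graph" can write to an image file instead; a sketch, assuming a plotutils build with PNG support (the output filename is arbitrary):

seq 0 0.1 20 | awk '{print $1, cos(0.5*$1)*sin(5*$1)}' | graph -T png > plot.png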

seq 6 | awk '{for(x=1; x<=5; x++) {printf ("%f ", rand())}; printf ("\n")}'
2009-03-24 21:33:38
User: kaan
Functions: awk printf seq
Tags: awk seq
Votes: 3

Displays six rows and five columns of random numbers between 0 and 1. If you need only one column, you can dispense with the "for" loop.
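For example, a single column of six random numbers needs no loop. Note that without a srand() call, awk's rand() yields the same sequence on every run:

seq 6 | awk 'BEGIN {srand()} {print rand()}'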

awk '{sum1+=$1; sum2+=$2} END {print sum1/NR, sum2/NR}' file.dat
2009-03-24 21:22:14
User: kaan
Functions: awk
Tags: awk
Votes: 2

This example calculates the averages of column one and column two of "file.dat". It can be easily modified if other columns are to be averaged.
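For instance, to average only the third column (a sketch assuming file.dat has at least three columns):

awk '{sum+=$3} END {print sum/NR}' file.dat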

seq 50| awk 'BEGIN {a=1; b=1} {print a; c=a+b; a=b; b=c}'
2009-03-24 20:39:24
User: kaan
Functions: awk seq
Tags: awk seq
Votes: 13

Another combination of seq and awk, this one printing the first 50 Fibonacci numbers. Not very efficient, but sufficiently quick.
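The same output can be produced in awk alone by moving the loop into a BEGIN block:

awk 'BEGIN {a=1; b=1; for(i=1; i<=50; i++) {print a; c=a+b; a=b; b=c}}'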

seq 100 | awk '{sum+=$1} END {print sum}'
2009-03-24 20:30:40
User: kaan
Functions: awk seq
Tags: awk seq
Votes: 4

"seq 100" outputs 1,2,..,100, separated by newlines. awk adds them up and displays the sum.

"seq 1 2 11" outputs 1,3,..,11.

Variations:

1+3+...+(2n-1) = n^2

seq 1 2 19 | awk '{sum+=$1} END {print sum}' # displays 100

1/2 + 1/4 + ... = 1

seq 10 | awk '{sum+=1/(2**$1)} END {print sum}' # displays 0.999023
ps ax | grep <processname> | grep -v grep | awk '{print $1}' | sudo xargs kill -9
/usr/sbin/arp -i eth0 | awk '{print $3}' | sed 1d
ps aux | grep 'httpd ' | awk '{print $2}' | xargs kill -9
mysql --database=dbname -B -N -e "SHOW TABLES" | awk '{print "ALTER TABLE", $1, "CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;"}' | mysql --database=dbname &
2009-03-21 18:45:15
User: root
Functions: awk
Tags: mysql
Votes: 17

This loops through all tables and changes their collations to UTF8. You should back up beforehand, though, in case some data is lost in the process.
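A minimal backup beforehand might look like this, assuming mysqldump is available and dbname matches the database above:

mysqldump dbname > dbname-backup.sql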

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
2009-03-21 06:41:59
Functions: awk printf sudo zcat
Votes: 22

Shows the number of failed login attempts per account. If the user does not exist, it is marked with *.
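The same report for the current, uncompressed log is a small variation (assuming the usual /var/log/auth.log location):

sudo awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}' /var/log/auth.log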

alias ropened='p4 opened | awk -F# "{print \$1}" | p4 -x - revert'
alias opened='p4 opened | awk -F# "{print \$1}"'
2009-03-20 11:06:41
User: Alexander
Functions: alias awk
Tags: p4 SCM Perforce
Votes: 0

Just type 'opened' and get all files currently opened for edit.

vos listvldb | agrep LOCKED -d RWrite | grep RWrite: | awk -F: '{print $2}' | awk '{printf("%s ",$1)} END {printf("\n")}'
2009-03-17 19:55:39
User: mpb
Functions: awk grep
Votes: 0

This command shows if there are any locked AFS volumes.

The output is a list of AFS volume IDs (or nothing if there are none locked).
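Each reported volume ID can then be unlocked individually; a sketch, assuming OpenAFS's vos unlock and sufficient privileges:

vos unlock -id <volume-id>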

cat count.txt | awk '{ sum+=$1} END {print sum}'
2009-03-16 00:22:13
User: duxklr
Functions: awk cat
Tags: awk
Votes: 15

Takes an input file (count.txt) that looks like:

1
2
3
4
5

It will add/sum the first column of numbers.
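An awk-free alternative, assuming single-column input like the above, joins the lines with '+' and hands the expression to bc:

paste -sd+ count.txt | bc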

ls -1 | grep " " | awk '{printf("mv \"%s\" ",$0); gsub(/ /,"_",$0); printf("%s\n",$0)}' | sh # rename filenames: spaces to "_"
2009-03-15 18:42:43
User: mpb
Functions: awk grep ls rename sh
Votes: 2

This command renames files with embedded spaces in the current directory, replacing the spaces with the underscore ("_") character.
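In bash, the same rename can be sketched with parameter expansion instead, which avoids the quoting pitfalls of generated mv commands:

for f in *\ *; do mv -- "$f" "${f// /_}"; done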

ls -l|awk '{print $6,$8}'|sort -d
2009-03-13 19:00:18
User: archlich
Functions: awk ls sort
Votes: -4

Can pipe to tail or change the awk for file size, groups, users, etc.
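For example, to list by file size instead (a sketch; field positions vary with ls version and locale, and $NF truncates filenames containing spaces):

ls -l | awk 'NR>1 {print $5, $NF}' | sort -n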

lsof|grep /somemount/| awk '{print $2}'|xargs kill
2009-03-12 18:42:19
User: archlich
Functions: awk grep xargs
Votes: 4

This command will kill all processes using a directory. It's quick and dirty. One may also use a -9 with kill in case regular kill doesn't work. This is useful if one needs to umount a directory.
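On Linux, assuming fuser is installed, a shorter equivalent kills everything holding the mount in one step:

fuser -km /somemount/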

svn status | grep "^\?" | awk '{print $2}' | xargs svn add
2009-03-12 15:06:12
User: unixfu73000
Functions: awk grep xargs
Tags: svn
Votes: -1

This adds all new files to SVN recursively. It doesn't work for files that have spaces in their name, but why would you create a file with a space in its name in the first place?
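A space-tolerant sketch strips the status column with sed and splits on newlines only (assumes GNU xargs for the -d option):

svn status | grep "^?" | sed 's/^?[[:space:]]*//' | xargs -d '\n' svn add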

svn status | grep ^? | awk '{print $2}' | xargs rm -rf
2009-03-10 17:01:40
User: Highwayman
Functions: awk grep rm xargs
Votes: 1

Removes all unversioned files and folders from an svn working copy. Also:

svn status --no-ignore | grep ^I | awk '{print $2}' | xargs rm -rf

will remove those files which svn status ignores. Handy to add to a script which is in your path so you can run it from any repository (a la 'svn_clean.sh').

grep Mar/2009 /var/log/apache2/access.log | awk '{ print $1 }' | sort -n | uniq -c | sort -rn | head
echo -n $mypass | md5sum | awk '{print $1}'
2009-03-10 13:12:21
User: tororebelde
Functions: awk echo md5sum
Votes: 1

This was useful for generating random passwords for some web page users, using the sample code inside a bash script.
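Inside a script, usage might look like this sketch (the placeholder password value is an assumption, not part of the original):

mypass='s3cret'; echo -n "$mypass" | md5sum | awk '{print $1}'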

awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f } NF > nf { nf = NF } END { for (f = 1; f <= nf; f++) for (r = 1; r <= NR; r++) printf a[r, f] (r==NR ? RS : FS) }'
2009-03-10 05:35:22
User: MSBF
Functions: awk printf
Votes: 0

Works the same as R's t() (matrix transpose).
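A quick usage example on a 2x3 input; the expected output is the three rows '1 4', '2 5' and '3 6':

printf '1 2 3\n4 5 6\n' | awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f } NF > nf { nf = NF } END { for (f = 1; f <= nf; f++) for (r = 1; r <= NR; r++) printf a[r, f] (r==NR ? RS : FS) }'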

sudo cat /proc/kcore | strings | awk 'length > 20' | less
2009-03-09 02:19:47
User: nesquick
Functions: awk cat strings sudo
Tags: cat ram strings
Votes: 15

This command lets you see and scroll through all of the strings that are stored in the RAM at any given time. Press space bar to scroll through to see more pages (or use the arrow keys etc).

Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here!

The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable").

If you want to dump the whole thing to a file, replace the final '| less' with '> memorydump'. This is great for searching through many times (and with the added bonus that it doesn't overwrite any memory...).

Here's a neat example that turns up conversations held in Pidgin (it will probably still work after Pidgin has been closed)...

sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'

(depending on sudo settings it might be best to run

sudo su

first to get to a # prompt)