What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using awk - 1,203 results
awk '{$1=""; print}'
vmstat 2 10 | awk 'NR > 2 {print NR, $13}' | gnuplot -e "set terminal png;set output 'v.png';plot '-' u 1:2 t 'cpu' w linespoints;"
awk '{printf "select * from table where id = %c%s%c;\n",39,$1,39; }' inputfile.txt
2009-09-21 14:08:04
User: alvinx
Functions: awk

inputfile.txt is a space-separated text file; the 1st column contains the items (ids) I want to put into my SQL statement.

39 = the character code for a single quote '

$1 = the first column

If inputfile.txt is a CSV file separated by ",", use FS to define your own field separator:

awk 'BEGIN {FS=","; }{printf "select * from table where id = %c%s%c;\n",39,$1,39; }' inputfile.txt
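A minimal variant of the same idea (a sketch, using the same hypothetical inputfile.txt) passes the quote character in as an awk variable instead of using its character code:

```shell
# Same query generator, but the single quote arrives via -v
# instead of character code 39; the output is identical.
awk -v q="'" '{printf "select * from table where id = %s%s%s;\n", q, $1, q}' inputfile.txt
```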
netstat -an | awk '$1 ~ /[Tt][Cc][Pp]/ && $NF ~ /ESTABLISHED/{i++}END{print "Connected:\t", i}'
xev -id `xwininfo | grep 'Window id' | awk '{print $4}'`
2009-09-19 22:47:16
User: ktoso
Functions: awk grep

After executing this, click on the window you want to track X Window events in.

Explanation: xev tracks events in the window with the given -id, which we get by grepping the window information printed by xwininfo.

netstat -lantp | grep -i stab | awk -F/ '{print $2}' | sort | uniq
2009-09-19 14:54:31
User: ProMole
Functions: awk grep netstat sort

Shows the applications currently using the network.

Can be used to discover which programs create internet traffic. Drop everything after the awk to get more detail, though the output will then no longer be limited to unique processes.

This version also works with other languages such as Spanish and Portuguese, as long as the word for "ESTABLISHED" still contains the fragment "STAB" (e.g. "ESTABELECIDO").

netstat -lantp | grep -i establ | awk -F/ '{print $2}' | sort | uniq
netstat -lantp | grep -i establ | awk -F/ '{print $2}' | uniq | sort
2009-09-19 13:54:36
User: ktoso
Functions: awk grep netstat uniq

Can be used to discover which programs create internet traffic. Drop everything after the awk to get more detail.

Has anyone an idea why the uniq doesn't work properly here (see sample output)?
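For what it's worth, uniq only collapses *adjacent* duplicate lines, which is why it has to run after sort rather than before it. A quick demonstration:

```shell
# uniq only removes adjacent duplicates, so unsorted input keeps repeats
printf 'b\na\nb\n' | uniq          # b a b - no two duplicates were adjacent
printf 'b\na\nb\n' | sort | uniq   # a b   - sorting made them adjacent first
```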

git diff --numstat | awk '{if ($1 == "0" && $2 == "0") print $3}' | xargs git checkout HEAD
2009-09-17 22:12:50
User: lingo
Functions: awk diff xargs

I sometimes (due to mismanagement!) end up with files in a git repo which have had their modes changed, but not their content. This one-liner lets me revert the mode changes, while leaving changed-content files be, so I can commit just the actual changes made.
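A sketch of the effect in a throwaway repo (file names here are hypothetical): git diff --numstat reports "0 0" for mode-only changes, which is exactly what the awk filter keys on.

```shell
# Throwaway repo with one mode-only change and one content change
git init -q demo && cd demo
printf 'a\n' > mode-only
printf 'b\n' > content
git add . && git -c user.email=e@x -c user.name=n commit -qm init
chmod +x mode-only          # numstat reports this as "0 0 mode-only"
printf 'bb\n' > content     # a real content change
git diff --numstat | awk '{if ($1 == "0" && $2 == "0") print $3}' | xargs git checkout HEAD
git diff --numstat          # only the content change survives
```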

ls -lt|grep ^-|awk 'NR>5 { print $8 }'|xargs -r rm
ls -t | awk 'NR>5 {system("rm \"" $0 "\"")}'
2009-09-16 04:58:08
User: haivu
Functions: awk ls
Tags: awk ls

I have a directory containing log files. This command deletes all but the 5 latest ones. Here is how it works:

* The ls -t command lists all files, newest first

* The awk expression means: for every line past the 5th, run rm on that filename
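It can be worth previewing what would be removed before running it. A sketch with hypothetical log files (note that parsing ls output breaks on filenames containing newlines, and GNU touch -d is assumed):

```shell
# Seven dummy logs with distinct, known timestamps
for i in 1 2 3 4 5 6 7; do touch -d "2020-01-0$i" "log$i"; done
ls -t | awk 'NR>5'    # lists only the two oldest: log2, log1
```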

awk '{delta = $1 - avg; avg += delta / NR; mean2 += delta * ($1 - avg); } END { print sqrt(mean2 / NR); }'
2009-09-11 04:46:01
User: ashawley
Functions: awk delta
Tags: awk

This will calculate a running standard deviation in one pass and should never have the possibility for overflow that can happen with other implementations. I suppose there is a potential for underflow in the corner case where the deltas are small or the values themselves are small.
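As a sanity check, the values 2, 4, 4, 4, 5, 5, 7, 9 have a population standard deviation of exactly 2 (mean 5, sum of squared deviations 32, 32/8 = 4, sqrt(4) = 2):

```shell
printf '2\n4\n4\n4\n5\n5\n7\n9\n' | \
  awk '{delta = $1 - avg; avg += delta / NR; mean2 += delta * ($1 - avg)} END {print sqrt(mean2 / NR)}'
# prints 2
```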

for IP in $(/sbin/ifconfig | fgrep addr: | sed 's/.*addr:\([[0-9.]*\) .*/\1/') ; do host $IP | awk '{print $5}'; done
awk '{avg += ($1 - avg) / NR;} END { print avg; }'
2009-09-10 17:06:03
User: ashawley
Functions: awk

This is an on-line algorithm for calculating the mean of the numbers in a column, also known as a "running average" or "moving average".
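For example, the running mean of 1, 2, 3, 4 comes out to 2.5:

```shell
# Each line nudges avg toward the new value by (x - avg) / NR
printf '1\n2\n3\n4\n' | awk '{avg += ($1 - avg) / NR} END {print avg}'
# prints 2.5
```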

awk 'length>72' file
2009-09-10 05:54:41
User: haivu
Functions: awk
Tags: awk

This command displays the lines that are longer than 72 characters. I use it to identify those lines in my scripts and shorten them the way I like.
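A quick demonstration on a hypothetical two-line file, one short line and one 80-character line:

```shell
# Build a file with one short and one long line
{ printf 'short line\n'; printf 'x%.0s' $(seq 1 80); printf '\n'; } > demo.txt
awk 'length>72' demo.txt    # prints only the 80-character line
```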

ssh root@`for ((i=100; i<=110; i++));do arp -a 192.168.1.$i; done | grep 00:35:cf:56:b2:2g | awk '{print $2}' | sed -e 's/(//' -e 's/)//'`
2009-09-09 04:32:20
User: gean01
Functions: arp awk grep sed ssh

Connect to a machine running ssh by its MAC address, using the "arp" command to resolve the address to an IP.

echo src::${PATH} | awk 'BEGIN{pwd=ENVIRON["PWD"];RS=":";FS="\n"}!$1{$1=pwd}$1!~/^\//{$1=pwd"/"$1}{print $1}'
2009-09-09 04:03:46
User: arcege
Functions: awk echo
Tags: awk echo PATH

Removes the trailing newline; the colon becomes the record separator and the newline the field separator, and only the first field is ever printed. Empty entries are replaced with $PWD, and relative entries (like ".") are prefixed with the current directory ($PWD). You can change PWD with env(1) to get tricky in (non-Bourne) scripts.
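For instance, with the current directory /tmp and a PATH-like string containing a relative entry and an empty entry:

```shell
cd /tmp
echo 'src::/usr/bin' | awk 'BEGIN{pwd=ENVIRON["PWD"];RS=":";FS="\n"}!$1{$1=pwd}$1!~/^\//{$1=pwd"/"$1}{print $1}'
# prints:
#   /tmp/src
#   /tmp
#   /usr/bin
```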

find . -name \*.c | xargs wc -l | tail -1 | awk '{print $1}'
2009-09-08 08:25:45
User: karpoke
Functions: awk find tail wc xargs
Tags: awk find wc

This is really fast :)

time find . -name \*.c | xargs wc -l | tail -1 | awk '{print $1}'


real 0m0.191s

user 0m0.068s

sys 0m0.116s

ls -ldct /lost+found |awk '{print $6, $7}'
curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
2009-09-07 21:56:40
User: postrational
Functions: awk sed tr

Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages.

For some reason sed gets stuck on OS X, so here's a Perl version for the Mac:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/'

If you want to see the name of the last person, who added a message to the conversation, change the greediness of the operators like this:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/'
fuser -k `who -u | awk '$6 == "old" { print "/dev/"$2 }'`
2009-09-07 03:36:43
User: lbonanomi
Functions: awk fuser
Tags: Linux solaris

Shell timeout variables (TMOUT) can be very liberal about what counts as 'activity', such as having an editor open. This command string terminates the login shell of any user with more than a day of idle time.

pkg search SEARCH_TERM | awk '{print $NF}' | sed -e 's;.*/\(.*\)\@.*;\1;' | sort -u
awk 'BEGIN {a=1;b=1;for(i=0;i<'${NUM}';i++){print a;c=a+b;a=b;b=c}}'
2009-09-06 03:05:55
User: arcege
Functions: awk
Tags: awk

Does not require any input to run or terminate. The number of iterations is controlled by the shell variable $NUM.
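For example, with NUM set to 8 this prints the first eight Fibonacci numbers:

```shell
NUM=8
# $NUM is spliced into the awk program by the shell before awk runs
awk 'BEGIN {a=1;b=1;for(i=0;i<'${NUM}';i++){print a;c=a+b;a=b;b=c}}'
# prints 1 1 2 3 5 8 13 21, one number per line
```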

ls -lct /etc/ | tail -1 | awk '{print $6, $7, $8}'
2009-09-04 16:52:50
User: peshay
Functions: awk ls tail

Also shows the time if the file is from the current year, or the year if it was installed before the current year; works even if /etc is a link (Mac OS).

find . -type f -name '*.c' -exec wc -l {} \; | awk '{sum+=$1} END {print sum}'
2009-09-04 15:51:30
User: arcege
Functions: awk find wc
Tags: awk find wc

Have wc count each file, then add up the totals with awk; I get a 43% speed increase on RHEL over using "-exec cat|wc -l" and a 67% increase on my Ubuntu laptop (this is with 10MB of data in 767 files).
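A miniature check of the summing behaviour (hypothetical files): wc -l prints one count per file, and the awk adds up the first column.

```shell
# Two tiny .c files: 2 lines + 1 line = 3
printf 'a\nb\n' > one.c
printf 'c\n' > two.c
find . -type f -name '*.c' -exec wc -l {} \; | awk '{sum+=$1} END {print sum}'
# prints 3 (assuming no other .c files under the current directory)
```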