What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.

March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands tagged awk - 303 results
ifconfig | awk -F: '/inet addr:/ { sub(/\.[^.]+$/, "", $2); if (!seen[$2]++ && $2 != "127.0.0") print $2 }'
2016-09-01 14:02:43
User: chrismccoy
Functions: awk ifconfig
Tags: ifconfig awk

For machines that have IP addresses in many different Class C (/24) blocks, this shows which blocks are in use, skipping loopback.
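The dedup-by-prefix idea can be sketched on canned ifconfig-style input (the addresses below are made up); the awk strips the last octet, then prints each /24 prefix only the first time it is seen:

```shell
# Canned "inet addr:" lines standing in for real ifconfig output.
# sub() removes the final ".<octet>..." so $2 becomes the /24 prefix;
# seen[] dedups, and the 127.0.0 loopback prefix is filtered out.
printf 'inet addr:192.168.1.10  Bcast:192.168.1.255
inet addr:192.168.1.20  Bcast:192.168.1.255
inet addr:10.0.0.5  Bcast:10.0.0.255
inet addr:127.0.0.1  Mask:255.0.0.0
' | awk -F: '/inet addr:/ { sub(/\.[^.]+$/, "", $2); if (!seen[$2]++ && $2 != "127.0.0") print $2 }'
# prints:
# 192.168.1
# 10.0.0
```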

ps auxw | grep -E 'sbin/(apache|httpd)' | awk '{print"-p " $2}' | xargs strace -F
2016-08-04 10:59:58
User: gormux
Functions: awk grep ps strace xargs
Tags: awk grep ps strace

Attaches strace to every Apache process, on systems using sbin/apache (Debian) or sbin/httpd (Red Hat), and follows newly created child processes (-F).
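The argument-building stage can be sketched on canned ps output (the pids and paths are made up), without attaching strace to anything; the real pipeline then hands these -p arguments to xargs strace:

```shell
# Two fake apache rows and one unrelated row standing in for ps auxw;
# grep keeps the apache/httpd lines and awk turns each pid into "-p PID".
printf 'root      1001  0.0  0.1 /usr/sbin/apache2 -k start
www-data  1002  0.0  0.1 /usr/sbin/apache2 -k start
root      1003  0.0  0.1 /usr/bin/other
' | grep -E 'sbin/(apache|httpd)' | awk '{print"-p " $2}'
# prints:
# -p 1001
# -p 1002
```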

awk 'BEGIN{srand()} match($0, /DELTA=([0-9]+);/, a) {w[i++]=a[1]} END {print w[int(rand()*i)]}' file.name
2015-11-13 17:56:34
User: jkirchartz
Functions: awk
Tags: awk regex random

Seed the random number generator, find all matches in the file, put each capture-group match into an array, and finally print a random element from the array.
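The three-argument match() used above is a GNU awk extension; a portable sketch of the same steps, on a canned file of made-up DELTA values, extracts the number with RSTART/RLENGTH instead:

```shell
# Canned input lines; each DELTA=<n>; value goes into w[], and END
# prints a random element. substr() skips "DELTA=" (6 chars) and
# drops the trailing ";" (RLENGTH-7 remaining digits).
printf 'a DELTA=10; x
b DELTA=25; y
c DELTA=300; z
' | awk 'BEGIN{srand()} match($0, /DELTA=[0-9]+;/) {w[i++]=substr($0, RSTART+6, RLENGTH-7)} END {print w[int(rand()*i)]}'
# prints one of: 10, 25, 300
```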

F=bigdata.xz; lsof -o0 -o -Fo $F | awk -Ft -v s=$(stat -c %s $F) '/^o/{printf("%d%%\n", 100*$2/s)}'
2015-09-19 22:22:43
User: flatcap
Functions: awk stat

Imagine you've started a long-running process that involves piping data,

but you forgot to add the progress-bar option to a command.


xz -dc bigdata.xz | complicated-processing-program > summary


This command uses lsof to see how much data xz has read from the file.

lsof -o0 -o -Fo FILENAME

Display offsets (-o), in decimal (-o0), in parseable form (-Fo)

This will output something like:

p12345
f3
o0t45673152

Process id (p), File Descriptor (f), Offset (o)


We stat the file to get its size

stat -c %s FILENAME


Then we plug the values into awk.

Split the line at the letter t: -Ft

Define a variable for the file's size: -v s=$(stat...)

Only work on the offset line: /^o/


Note this command was tested using the Linux version of lsof.

Because it uses lsof's field-output option (-F), it should be fairly portable.


Thanks to @unhammer for the brilliant idea.
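The awk stage can be sketched on its own, fed a canned lsof -Fo record (the pid, fd and offset values are made up): with a file size of 1000 bytes and an offset of 250, it reports 25%.

```shell
# Canned field-output record: p<pid>, f<fd>, o0t<decimal offset>.
# -Ft splits the offset line "o0t250" at the letter t, so $2 is 250;
# the file size arrives in the awk variable s.
printf 'p12345
f3
o0t250
' | awk -Ft -v s=1000 '/^o/{printf("%d%%\n", 100*$2/s)}'
# prints: 25%
```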

watch "awk '/Rss/{sum += \$2; } END{print sum, \"kB\"}' < /proc/$(pidof firefox)/smaps"
2015-09-19 00:36:34
User: gumnos
Functions: watch

Sometimes top/htop don't give the fine-grained detail on memory usage you might need. This sums exactly the memory fields you want (here, Rss) from the process's smaps file.
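The summing awk can be sketched on a canned smaps-style excerpt (the kB values are made up) rather than a live /proc/&lt;pid&gt;/smaps:

```shell
# Three fake Rss lines; awk adds up field 2 of every matching line.
printf 'Rss:                   4 kB
Rss:                  12 kB
Rss:                   8 kB
' | awk '/Rss/{sum += $2; } END{print sum, "kB"}'
# prints: 24 kB
```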

du -x --max-depth=1|sort -rn|awk -F / -v c=$COLUMNS 'NR==1{t=$1} NR>1{r=int($1/t*c+.5); b="\033[1;31m"; for (i=0; i<r; i++) b=b"#"; printf " %5.2f%% %s\033[0m %s\n", $1/t*100, b, $2}'|tac
2015-09-12 10:36:49
Functions: awk du printf sort

A more efficient way, with the output reversed (tac) so the biggest directories end up last, next to the prompt.

xmlpager() { xmlindent "$@" | awk '{gsub(">",">'`tput setf 4`'"); gsub("<","'`tput sgr0`'<"); print;} END {print "'`tput sgr0`'"}' | less -r; }
2015-07-12 09:22:10
User: hackerb9
Functions: awk less

Don't want to open up an editor just to view a bunch of XML files in an easy-to-read format? Now you can do it from the comfort of your own command line! :-)

This creates a new function, xmlpager, which shows an XML file in its entirety, but with the actual content (non-tag text) highlighted. It does this by setting the foreground to color #4 (red) after every tag and resetting it before the next tag. (Hint: try `tput bold` as an alternative.)

I use 'xmlindent' to neatly reflow and indent the text, but, of course, that's optional. If you don't have xmlindent, just replace it with 'cat'. Additionally, this example pipes into the optional 'less' pager; note the -r option, which allows raw escape codes to be passed to the terminal.

awk '{for(i=2;i<=NF;i++) printf("%s%s",$i,(i!=NF)?OFS:ORS)}'
awk '{ $1="";print}'
tr -s ' ' | cut -d' ' -f2-
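The three one-liners above all drop the first field, but they differ subtly in whitespace handling, as a quick sketch on a made-up line shows; note that reassigning $1 leaves a leading OFS behind:

```shell
line='alpha beta gamma'
echo "$line" | awk '{for(i=2;i<=NF;i++) printf("%s%s",$i,(i!=NF)?OFS:ORS)}'   # "beta gamma"
echo "$line" | awk '{ $1="";print}'                                          # " beta gamma" (leading space)
echo "$line" | tr -s ' ' | cut -d' ' -f2-                                    # "beta gamma"
```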
lsof -i -n -P | grep -e "$(ps aux | grep node | grep -v grep | awk -F' ' '{print $2}' | xargs | awk -F' ' '{str = $1; for(i = 2; i <= NF; i++) {str = str "\\|" $i} print str}')"
2015-02-14 23:24:00
User: hochmeister
Functions: grep

Use lsof and grep to list open network connections for every PID whose command matches a given name, such as "node".
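The pid-joining awk stage can be sketched on its own with canned pids (note the loop must run to i <= NF, or the last pid is dropped); it builds a \|-separated alternation usable as a grep BRE pattern:

```shell
# Three fake pids on one line, as xargs would produce them.
echo "123 456 789" | awk -F' ' '{str = $1; for(i = 2; i <= NF; i++) {str = str "\\|" $i} print str}'
# prints: 123\|456\|789
```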

tail -f access_log | awk '{print $1 , $12}'
2014-12-24 14:15:52
User: tyzbit
Functions: awk tail

Use this command to watch apache access logs in real time to see what pages are getting hit.

ip a s eth0 | awk -F'[/ ]+' '/inet[^6]/{print $3}'
ip addr show enp3s0 | awk '/inet[^6]/{print $2}' | awk -F'/' '{print $1}'
for i in `cat hosts_list`; do RES=`ssh myusername@${i} "ps -ef " |awk '/[p]rocessname/ {print $2}'`; test "x${RES}" = "x" && echo $i; done
2014-10-03 14:57:54
User: arlequin
Functions: awk echo test
Tags: ssh awk test ps

Given a hosts list, ssh into each one and echo its name only if 'processname' is not running.

tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | awk ' $1 > 10 '
2014-09-26 01:15:23
User: hochmeister
Functions: awk sort tcpdump uniq

Capture 2000 packets and print the source addresses seen more than 10 times (the trailing awk filters on the count, so this is not a strict top 10).
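The counting stages can be sketched on canned addresses (made up, no live capture): only addresses seen more than 10 times survive the final awk filter.

```shell
# 12 copies of one fake address and 3 of another, standing in for
# the truncated tcpdump lines; count, sort by count, keep counts > 10.
{ seq 12 | sed 's/.*/10.0.0.1/'; seq 3 | sed 's/.*/10.0.0.2/'; } |
sort | uniq -c | sort -nr | awk ' $1 > 10 '
# prints (uniq -c pads the count with spaces):
#      12 10.0.0.1
```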

finger $(whoami) | egrep -o 'Name: [a-zA-Z0-9 ]{1,}' | cut -d ':' -f 2 | xargs echo
2014-09-24 01:22:07
User: swebber
Functions: cut egrep finger xargs

It's possible to use a simple regex to extract the username from the finger command.

The final echo is optional; it just removes the leading space.
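The extraction can be sketched on a canned finger-style line (the login and name are made up), with no finger needed; grep -Eo is the modern spelling of egrep -o:

```shell
# grep -Eo keeps just the "Name: ..." match, cut drops the label,
# and xargs echo trims the leading space.
echo 'Login: jdoe           Name: John Doe' |
grep -Eo 'Name: [a-zA-Z0-9 ]{1,}' | cut -d ':' -f 2 | xargs echo
# prints: John Doe
```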

find . -name "*.pdf" -print0 | xargs -r0 stat -c %y\ %n | sort|awk '{print $4}'|gawk 'BEGIN{ a=1 }{ printf "mv %s %04d.pdf\n", $0, a++ }' | bash
2014-09-23 06:40:45
Functions: awk find gawk printf stat xargs
Tags: sort awk find xargs

Caution: destructive overwrite of filenames.

Useful for renaming PDFs into date order before concatenating them with pdftk.
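The rename-generating stage can be sketched on canned names, without the trailing "| bash", so nothing is actually moved (plain awk works here in place of gawk):

```shell
# Each input name becomes an "mv" command with a zero-padded counter.
printf 'a.pdf
b.pdf
c.pdf
' | awk 'BEGIN{ a=1 }{ printf "mv %s %04d.pdf\n", $0, a++ }'
# prints:
# mv a.pdf 0001.pdf
# mv b.pdf 0002.pdf
# mv c.pdf 0003.pdf
```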

du -sm *| sort -nr | awk '{ size=4+5*int($1/5); a[size]++ }; END { print "size(from->to) number graph"; for(i in a){ printf("%d %d ",i,a[i]) ; hist=a[i]; while(hist>0){printf("#") ; hist=hist-5} ; printf("\n")}}'
2014-08-19 14:43:20
User: higuita
Functions: awk du sort
Tags: awk

This command draws a small histogram of size blocks (5 MB buckets in this example), not individual files. Fine-tune the 4+5*int($1/5) expression for your own bucket width: jump-1+jump*int($1/jump).

Also tune the hist=hist-5 part for bigger or smaller graphs.
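The bucketing formula can be sketched on its own with made-up sizes: values 3, 7 and 12 (MB) land in the 5 MB-wide buckets labelled 4, 9 and 14.

```shell
# Each size maps to the top of its 5 MB bucket: 0-4 -> 4, 5-9 -> 9, ...
printf '3
7
12
' | awk '{ print $1, "->", 4+5*int($1/5) }'
# prints:
# 3 -> 4
# 7 -> 9
# 12 -> 14
```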

YEAR=2015; echo Jul $(ncal 7 $YEAR | awk '/^Fr/{print $NF}')
2014-08-17 11:12:09
User: andreasS
Functions: awk echo
Tags: awk date

Calculate the date of Sysadmin day (last Friday of July) of any given year

awp () { awk '{print $'$1'}'; }
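The awp helper above picks a single column by number; a quick sketch (re-declaring the function so the snippet is self-contained):

```shell
# The shell splices the column number into the awk program:
# awp 2 runs awk '{print $2}'.
awp () { awk '{print $'$1'}'; }
echo 'one two three' | awp 2
# prints: two
```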
system_profiler SPHardwareDataType | awk '/UUID/ { print $3; }'
2014-07-25 06:54:40
Functions: awk

Gets the Hardware UUID of the current machine using system_profiler.

svn status | awk -F" " '{ name=$2; for (i=3; i<=NF; i++) name=name OFS $i; print "ls -ld \"" name "\"" }' | sh
2014-07-09 09:41:24
User: kln0thing
Functions: awk
Tags: svn awk ls

The awk part collates fields 2 through N into a single name, so it handles svn paths that contain spaces (typical when the code is interchangeably used with a Windows environment, for example by documentation teams). The output is passed to "ls -ld"; the -d option tells ls to list the directory entry itself rather than its contents. (Adding ls's -p option marks directories with a trailing slash for extra readability.)

Finally, the entire constructed command is passed to sh for execution.
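The field-collating step can be sketched on a canned svn-status line whose (made-up) path contains a space, without running svn or sh:

```shell
# $2..$NF are rejoined with OFS so the quoted path survives intact.
echo 'M       docs/release notes.txt' |
awk -F" " '{ name=$2; for (i=3; i<=NF; i++) name=name OFS $i; print "ls -ld \"" name "\"" }'
# prints: ls -ld "docs/release notes.txt"
```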

awk '/text to grep/{print $1}' "log" | sort -n | uniq -c | sort -rn | head -n 100
2014-07-09 08:48:06
User: kln0thing
Functions: awk head sort uniq

Original command: cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100

That version wastes a cat and a grep when awk can read the file and do the filtering itself.
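The slimmed-down pipeline can be sketched on a canned log fragment (made-up addresses): awk both filters and extracts the first field before the counting stages.

```shell
# Two matching lines share an address, one line doesn't match at all.
printf '1.2.3.4 text to grep here
5.6.7.8 other line
1.2.3.4 text to grep again
' | awk '/text to grep/{print $1}' | sort -n | uniq -c | sort -rn | head -n 100
# prints (uniq -c pads the count with spaces):
#       2 1.2.3.4
```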

mco ping | head -n -4 | awk '{print $1}' | sort