What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that reach a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Terminal - Commands tagged awk - 297 results
apt-get -s upgrade | awk '/Inst.+/ {print $2}'
2013-03-25 21:23:11
User: lpanebr
Functions: apt awk
Tags: awk apt-get

Useful if you only want to see the package names, or if you want to use them in a script.
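To see what the awk filter does without touching a real system, here is the same filter applied to canned `apt-get -s upgrade`-style output (the Inst/Conf lines below are made up for illustration):

```shell
# `Inst` lines name packages that would be installed/upgraded; the awk
# pattern keeps those lines and prints field 2, the package name.
printf '%s\n' \
  'Inst libfoo [1.0-1] (1.0-2 Debian:stable [amd64])' \
  'Conf libfoo (1.0-2 Debian:stable [amd64])' \
  'Inst bar [2.0-1] (2.1-1 Debian:stable [amd64])' \
| awk '/Inst.+/ {print $2}'
# prints:
# libfoo
# bar
```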

find /some/path -type f | gawk -F/ '{print $NF}' | gawk -F. '/\./{print $NF}' | sort | uniq -c | sort -rn
2013-03-18 14:40:26
User: skkzsh
Functions: find gawk sort uniq

If you have GNU findutils, you can get only the file name with

find /some/path -type f -printf '%f\n'

instead of

find /some/path -type f | gawk -F/ '{print $NF}'
svn revert .
df | awk '{if ($2!=dspace) print "different"; dspace=$2;}'
ps auxw | grep sbin/apache | awk '{print"-p " $2}' | xargs strace -f
2013-02-19 19:14:57
User: msealand
Functions: awk grep ps strace xargs

This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.

sudo ifconfig wlan0 | grep inet | awk 'NR==1 {print $2}' | cut -c 6-
2013-02-18 14:10:07
User: mouths
Functions: awk cut grep ifconfig sudo

On wired connections set 'eth0' instead of 'wlan0'

awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}'
2013-02-12 13:57:43
User: adimania
Functions: awk printf
Tags: awk

It prints the file names, preserving the spaces in their names, and adds a new line after each filename.

I wrote this to quickly find out how many files in a directory are owned by a particular user. It can be extended with pipes and grep to do much more.
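The input is assumed to be `ls -l`-style output (an assumption; the description doesn't show the producing command). A quick sketch with one made-up line:

```shell
# In `ls -l` output, fields 1-8 are metadata and the file name (here
# containing a space) starts at field 9, so the loop reassembles it.
echo '-rw-r--r-- 1 user group 0 Feb 12 13:57 my file.txt' \
| awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}'
# prints: my file.txt (with a trailing space)
```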

find . -type f -size +100M
load=`uptime|awk -F',' '{print $3}'|awk '{print $3}'`; if [[ $(echo "if ($load > 1.0) 1 else 0" | bc) -eq 1 ]]; then notify-send "Load $load";fi
2013-02-06 08:30:24
User: adimania
Functions: awk echo

I occasionally run this via crontab every minute on my machine to see if a process is eating up my system's resources.
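A crontab entry to run it every minute might look like this (the script path is hypothetical; it would hold the one-liner above):

```
# m h dom mon dow  command          (path below is hypothetical)
* * * * * /path/to/check-load.sh
```

Note that notify-send run from cron typically needs DISPLAY (and often DBUS_SESSION_BUS_ADDRESS) set in the script's environment to reach your desktop session.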

curl http://en.wikipedia.org/wiki/List_of_programming_languages | grep "<li>" | awk -F"title=" '{ print $2 }' | awk -F\" '{ print $2 }'
2013-01-09 21:40:11
User: sxiii
Functions: awk grep

Requirements: curl, grep, awk, internet connection with access to wikipedia

Loaded page: http://en.wikipedia.org/wiki/List_of_programming_languages

If you can make a shorter version of this list-getter, you are welcome to paste it here :)

find . -name "*.pdf" -exec pdftk {} dump_data output \; | grep NumberOfPages | awk '{s+=$2} END {print s}'
exipick -zi | xargs --max-args=1000 exim -Mrm
2012-12-12 20:46:22
User: jasen
Functions: xargs
Tags: bash awk exim

do 1000 at a time so that if your doodoo is deep you can avoid the "command-line too big" error
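The batching behaviour of --max-args can be sketched in isolation; here seq stands in for exipick's message-ID list and echo stands in for `exim -Mrm` (both stand-ins are illustrative only):

```shell
# xargs invokes the command once per batch of at most 2 arguments
# (the real command uses batches of 1000).
seq 1 5 | xargs --max-args=2 echo batch:
# prints:
# batch: 1 2
# batch: 3 4
# batch: 5
```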

awk '/^md/ {printf "%s: ", $1}; /blocks/ {print $NF}' </proc/mdstat
find . -type f -print | awk -F'.' '{print $NF}' | sort | uniq -c
awk '{print NR "\t" $0}'
netstat -tn | awk '($4 ~ /:22\s*/) && ($6 ~ /^EST/) {print substr($5, 0, index($5,":"))}'
for file in `svn st | awk '{print $2}'`; do svn revert $file; done
sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
2012-09-11 14:51:10
User: sgowie
Functions: awk lastb sort sudo

The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). The reference file is read/write by root only by default. This can be quite an exhaustive list with lots of bots hammering away at your machine. Sometimes it is more important to see the scale of things, or in this case the volume of failed logins tied to each source IP.

The awk statement determines if the 3rd element is an IP address, and if so increments the running count of failed login attempts associated with it. When done it prints the IP and count.

The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, and that the -n and -k can be combined to -nk.

Please be aware that the btmp file will contain every instance of a failed login unless explicitly rolled over. It should be safe to delete/archive this file after you've processed it.

netstat -an | grep 80 | wc -l
a=$(xwininfo |gawk 'BEGIN {FS="[x+ \t]*"} /-geometry/ {print int(($3+1)/2)*2"x"int(($4+1)/2)*2"+"$5"+"$6}') ; echo ${a} ; ffmpeg -f x11grab -s ${a} -r 10 -i :0.0 -sameq -f mp4 -s wvga -y /tmp/out.mpg
2012-08-31 14:48:41
User: dwygo
Functions: echo gawk

Now we can capture only a specific window (we choose it by clicking on it)

ffmpeg complains about "Frame size must be a multiple of 2" so we calculate the upper even number with (g)awk trickery.

We removed the grep; we are already using (g)awk here... why waste time with grep?! ;)
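The even-rounding trick can be checked on its own: int((n+1)/2)*2 rounds an odd dimension up to the next even number and leaves even ones alone (plain awk used here for illustration; the original uses gawk):

```shell
# 853 is odd -> rounds up to 854; 481 is odd -> rounds up to 482.
echo '853 481' | awk '{print int(($1+1)/2)*2 "x" int(($2+1)/2)*2}'
# prints: 854x482
```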

tcpdump -ntr NAME_OF_CAPTURED_FILE.pcap 'tcp[13] = 0x02 and dst port 80' | awk '{print $4}' | tr . ' ' | awk '{print $1"."$2"."$3"."$4}' | sort | uniq -c | awk ' {print $2 "\t" $1 }'
sudo apt-get remove $(dpkg -l|awk '/^ii linux-image-/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic*//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic*\nlinux-image-%s-generic*\n",$0,$0,$0)}')
2012-08-15 10:02:12
User: mtron
Functions: awk sed sudo

Remove old kernels (*-generic and *-generic-pae) via apt-get on debian/ubuntu based systems. Tested on ubuntu 10.04 - 12.04.
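The version filter at the heart of this pipeline can be shown in isolation: `v>$0` is a plain string comparison, so it keeps lines that sort lexically before the running kernel version (3.2.0-40 below stands in for `uname -r`):

```shell
# Only versions that sort before the running one survive the filter.
printf '%s\n' 3.2.0-38 3.2.0-40 3.2.0-41 | awk -v v=3.2.0-40 'v>$0'
# prints: 3.2.0-38
```

Because the comparison is lexicographic rather than numeric, a version like 3.2.0-9 would sort after 3.2.0-10, so it is worth double-checking the generated package list before removing anything.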

cat /dev/urandom|od -t x1|awk 'NR > line { pos=int(rand()*15)+2;printf("%s",$pos);line=NR+(rand()*1000);digits = digits+2 } digits == 64 { print("\n");exit }'
2012-08-14 19:02:00
User: jetdillo
Functions: awk cat exit od

Use this the next time you need to come up with a reasonably random bitstring, like for a WPA/WPA2 PSK or something. It takes a continuous stream of bytes coming from /dev/urandom, runs it through od(1), picks a random field ($0 and $1 excluded) from a random line, and prints digits until it has 64 of them.
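If you just need the end result rather than the sampling trick, a shorter alternative (assuming a GNU or BSD od) is to read exactly 32 random bytes and print them as 64 hex digits:

```shell
# -An drops the offset column, -tx1 prints hex bytes, -N32 reads 32 bytes;
# tr strips the spacing so a single 64-digit string remains.
od -An -tx1 -N32 /dev/urandom | tr -d ' \n'; echo
```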

egrep '.*(("STATUS)|("HEAD)).*' http_access.2012.07.18.log | awk '{sum+=$11; ++n} END {print "Tot="sum"("n")";print "Avg="sum/n}'
2012-07-27 12:18:29
User: fanchok
Functions: awk egrep

Depending on your Apache access log configuration, you may have to change sum+=$11 to the previous or next awk field.

Beware: usually the last field of an access log line is the response time in microseconds, and the penultimate field is the response size in bytes. You can use this command line to calculate the sum and average of response sizes.

You can also refine the egrep regexp to match specific HTTP requests.
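The sum/average idiom itself is easy to sanity-check on fake input; here field 11 plays the role of the response size in bytes (the log lines are invented):

```shell
# awk accumulates field 11 into sum and counts lines in n,
# then prints the total and the average at END.
printf '%s\n' \
  'a b c d e f g h i j 100' \
  'a b c d e f g h i j 300' \
| awk '{sum+=$11; ++n} END {print "Tot="sum"("n")";print "Avg="sum/n}'
# prints:
# Tot=400(2)
# Avg=200
```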

hostname -I
2012-07-18 19:43:48
User: bashfan
Functions: hostname
Tags: awk IP ip address

That's the easiest way to do it. -I (capital i) displays all network addresses of the host.