
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using awk - 1,145 results
tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | awk ' $1 > 10 '
2014-09-26 01:15:23
User: hochmeister
Functions: awk sort tcpdump uniq
Votes: 1

Capture 2000 packets and print the top talkers (source addresses seen more than 10 times).

for line in `docker ps | awk '{print $1}' | grep -v CONTAINER`; do docker ps | grep $line | awk '{printf $NF" "}' && echo $(( `cat /sys/fs/cgroup/memory/docker/$line*/memory.usage_in_bytes` / 1024 / 1024 ))MB ; done
find . -name "*.pdf" -print0 | xargs -r0 stat -c %y\ %n | sort|awk '{print $4}'|gawk 'BEGIN{ a=1 }{ printf "mv %s %04d.pdf\n", $0, a++ }' | bash
2014-09-23 06:40:45
Functions: awk find gawk printf stat xargs
Tags: sort awk find xargs
Votes: 0

Caution: destructive overwrite of filenames.

Useful for concatenating pdfs in date order using pdftk
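
Once the files have been renamed to 0001.pdf, 0002.pdf, …, a pdftk call along these lines should concatenate them in date order (combined.pdf is just an illustrative output name):

# join the renamed, zero-padded PDFs in sequence into a single file
pdftk [0-9][0-9][0-9][0-9].pdf cat output combined.pdf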

awk '$2 == "/media/KINGSTON" {print $1}' /etc/mtab
docker images | grep '<none>' | awk '{print $3}' | xargs docker rmi
docker ps -a | grep 'Exit' | awk '{print $1}' | xargs docker rm
ifconfig | awk '/HWaddr/ {printf "mac: %s addr: %s\n", $5, $1}'
ifconfig | egrep -A2 "eth|wlan" | tr -d "\n"| sed 's/\-\-/\n/g'|awk '{print "mac: "$5 " " $7}' | sed 's/addr:/addr: /g'
for I in *.CR2; do if [ `exiv2 pr -p a -u $I | grep 'xmp.Rating' | awk '{ print $4 }'` == "1" ]; then echo $I; fi; done
dd if=/dev/random count=1 bs=2 2>/dev/null | od -i | awk '{print $2}' | head -1
cat /var/log/syslog | grep score= | awk '{print $15}' | more
du -sm *| sort -nr | awk '{ size=4+5*int($1/5); a[size]++ }; END { print "size(from->to) number graph"; for(i in a){ printf("%d %d ",i,a[i]) ; hist=a[i]; while(hist>0){printf("#") ; hist=hist-5} ; printf("\n")}}'
2014-08-19 14:43:20
User: higuita
Functions: awk du sort
Tags: awk
Votes: 0

This command draws a small text histogram of size buckets (5 MB wide in this example), not individual files. To change the bucket size, adjust the 4+5*int($1/5) expression following the pattern jump-1+jump*int($1/jump).

Also tune the hist=hist-5 step to make the bars longer or shorter.
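
For example, a sketch of the same histogram with 10 MB buckets and one '#' per 10 entries in each bucket:

# 10 MB buckets: size=9+10*int($1/10); bar length step: hist=hist-10
du -sm * | sort -nr | awk '{ size=9+10*int($1/10); a[size]++ }; END { print "size(from->to) number graph"; for(i in a){ printf("%d %d ", i, a[i]); hist=a[i]; while(hist>0){ printf("#"); hist=hist-10 }; printf("\n") } }'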

YEAR=2015; echo Jul $(ncal 7 $YEAR | awk '/^Fr/{print $NF}')
2014-08-17 11:12:09
User: andreasS
Functions: awk echo
Tags: awk date
Votes: 0

Calculate the date of Sysadmin day (last Friday of July) of any given year
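
The same trick works for any year, so a quick loop gives several dates at once (a sketch):

# print the last Friday of July (Sysadmin day) for a few years
for YEAR in 2014 2015 2016; do echo "$YEAR: Jul $(ncal 7 $YEAR | awk '/^Fr/{print $NF}')"; done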

awp () { awk '{print $'$1'}'; }
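
awp prints field N of whatever is piped into it; a usage sketch (the ps pipeline is just an illustration):

# print the PID column (field 2) of ps output via the awp helper
ps aux | awp 2
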
[ `curl 'http://crl.godaddy.com/gds5-16.crl' 2>/dev/null | openssl crl -inform DER -noout -nextupdate | awk -F= '{print $2}' | xargs -I{} date -d {} +%s` -gt `date -d '8 hours' +%s` ] && echo "OK" || echo "Expires soon"
2014-08-07 17:18:38
User: hufman
Functions: awk date echo xargs
Tags: openssl
Votes: 0

Downloads a CRL file, determines the expiration time, and checks when it will expire

yum list installed| awk '{print $1}'| grep -e "x86" -e "noarch" | grep -v -e '^@'| sort
2014-08-06 23:13:24
Functions: awk grep
Votes: 0

Great for moves, re-installs, etc., since it is not version-specific yet is architecture-specific.

CentOS's yum list is well known for wrapping lines.
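
For a move or re-install, a sketch of how that list could be used (pkglist.txt is just an illustrative file name, and entries mangled by the wrapping mentioned above may need checking):

# save the installed-package list, then reinstall from it on the target host
yum list installed | awk '{print $1}' | grep -e "x86" -e "noarch" | grep -v -e '^@' | sort > pkglist.txt
yum -y install $(cat pkglist.txt)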

sudo dpkg -P $(dpkg -l yourPkgName* | awk '$2 ~ /yourPkgName.*/ && $1 ~ /.i/ {print $2}')
2014-08-06 22:40:32
User: wejn
Functions: awk sudo
Tags: dpkg purge
Votes: 0

Recently in Debian Wheezy the dpkg command refuses to work with wildcards, so this is the one-liner alternative. (alternative to #13614)
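
Filled in with a hypothetical package family called foo-plugin, the invocation would look like this (a sketch):

# purge every installed package whose name starts with foo-plugin
sudo dpkg -P $(dpkg -l 'foo-plugin*' | awk '$2 ~ /foo-plugin.*/ && $1 ~ /.i/ {print $2}')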

sudo dpkg -P $(sudo dpkg -l yourPkgName* | awk '$2 ~ /yourPkgName.*/' | awk '$1 ~ /.i/' | awk '{print $2}')
2014-08-02 18:14:02
User: woohoo
Functions: awk sudo
Tags: dpkg purge
Votes: 0

Recently in Debian Wheezy the dpkg command refuses to work with wildcards, so this is the one-liner alternative.

netstat -tn 2>/dev/null | grep :80 | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -nr | head
awk -F: '{print $2}' access_log | sort | uniq -c
system_profiler SPHardwareDataType | awk '/UUID/ { print $3; }'
2014-07-25 06:54:40
Functions: awk
Votes: 0

Gets the Hardware UUID of the current machine using system_profiler.

ifconfig eth0 | grep inet | awk '{ print $2 }'
2014-07-23 20:43:15
User: smorg
Functions: awk grep ifconfig
Tags: centos
Votes: 0

I just use this to see my IP on the server I'm working on.
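
On hosts where ifconfig is not installed, a roughly equivalent iproute2 sketch (prints the address with its prefix length):

# show the IPv4 address (CIDR form) assigned to eth0
ip -4 addr show eth0 | awk '/inet/ {print $2}'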

grep -r "<script" | grep -v src | awk -F: '{print $1}' | uniq
2014-07-23 06:24:31
User: sucotronic
Functions: awk grep
Tags: PHP javascript
Votes: 2

Useful for finding where inline JavaScript is declared, so it can be extracted into a common file. You can redirect the output to a file and review it item by item.
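
A sketch of the redirect mentioned above (inline-script-files.txt is just an illustrative name):

# save the list of files containing inline <script> tags for later review
grep -r "<script" . | grep -v src | awk -F: '{print $1}' | uniq > inline-script-files.txt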

cat h.txt| while read line; do curl -s -X POST 'https://www.virustotal.com/vtapi/v2/file/report' --form apikey="APIKEY" --form resource="$line"|awk -F'positives\":' '{print "VTHits"$2}'|awk -F' ' '{print $1" "$2$5$6}'|sed 's/["}]//g' && sleep 15; done
awk '/text to grep/{print $1}' logs... | sort -n | uniq -c | sort -rn | head -n 100
2014-07-10 20:36:02
User: impinball
Functions: awk head sort uniq
Tags: Linux sh
Votes: 0

Accepts multiple files via logs.... Substitute "text to grep" for your search string.

If you want to alias this, you could do something like this:

alias parse-logs='awk "/$1/{print \$1}" ${@[@]:1} | sort -n | uniq -c | sort -rn | head -n 100'
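
Since bash aliases don't take arguments, a shell function may be a more reliable way to parameterize this; a sketch (the name parse_logs is just an illustration):

parse_logs () {
    # first argument: search pattern; remaining arguments: log files
    local pattern="$1"; shift
    awk "/$pattern/{print \$1}" "$@" | sort -n | uniq -c | sort -rn | head -n 100
}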