What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using awk - 1,150 results
tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
awk -F: '{uid[$3]=1}END{for(x=500; x<=600; x++) {if(uid[x] != ""){}else{print x; exit;}}}' /etc/passwd
2010-05-10 18:10:51
User: UnixSage
Functions: awk
2

Typical usage would be in a script that needs the next open UID in a range (in this case 500-600).
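
As a hedged sketch of that usage (newuser is just a placeholder account name, and the awk is the original condensed slightly):

NEXT_UID=$(awk -F: '{uid[$3]=1} END {for(x=500; x<=600; x++) if (uid[x] == "") {print x; exit}}' /etc/passwd)  # first free UID in 500-600
sudo useradd -u "$NEXT_UID" newuser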

changing_assets = `s3cmd sync --dry-run -P -M --exclude=*.php --delete-removed #{preprod_release_dir}/web/ #{s3_bucket} | grep -E 'delete:|upload:' | awk '{print $2}' | sed s_#{preprod_release_dir}/web__`
2010-05-07 16:03:42
User: trivoallan
Functions: awk grep sed sync
2

Can be useful to granularly flush files in a CDN after they've been changed in the S3 bucket.
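
Outside of Ruby, roughly the same list can be captured in plain shell. A hedged sketch, where PREPROD_RELEASE_DIR and S3_BUCKET are assumed shell variables standing in for the interpolated Ruby values:

changing_assets=$(s3cmd sync --dry-run -P -M --exclude='*.php' --delete-removed "$PREPROD_RELEASE_DIR/web/" "$S3_BUCKET" | grep -E 'delete:|upload:' | awk '{print $2}' | sed "s_${PREPROD_RELEASE_DIR}/web__")
echo "$changing_assets"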

netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
2010-05-06 17:04:37
User: Kered557
Functions: awk netstat sort uniq
1

Counts TCP states from Netstat and displays in an ordered list.
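
To order the list by how often each state occurs rather than alphabetically by state, one small variation is:

netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c | sort -rn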

echo "Keyword?";read keyword;query="http://www.shoutcast.com/sbin/newxml.phtml?search="$keyword"";curl -s $query |awk -F '"' 'NR <= 4 {next}NR>15{exit}{sub(/SHOUTcast.com/,"http://yp.shoutcast.com/sbin/tunein-station.pls?id="$6)}{print i++" )"$2}'
2010-05-03 00:44:10
User: benyounes
Functions: awk echo
Tags: awk curl
4

Searches for web radio stations matching the submitted keyword and returns each station's name and its link for listening.

This could be enhanced to read the user's selection and submit it to mplayer; a sketch follows below.
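
A hedged sketch of that enhancement, assuming the same XML field positions as the original command ($2 is the station name, $6 its id) and using /tmp/stations as a scratch file:

echo "Keyword?"; read keyword
curl -s "http://www.shoutcast.com/sbin/newxml.phtml?search=$keyword" | awk -F '"' 'NR<=4{next} NR>15{exit} {print i++") " $2 " http://yp.shoutcast.com/sbin/tunein-station.pls?id=" $6}' > /tmp/stations
cat /tmp/stations
echo "Station number?"; read n
mplayer -playlist "$(awk -v n="$n" '$1 == n")" {print $NF}' /tmp/stations)"  # -playlist because the link is a .pls file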

history | awk '{a[$'$(echo "1 2 $HISTTIMEFORMAT" | wc -w)']++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
2010-05-02 21:48:53
User: bandie91
Functions: awk echo sort wc
Tags: history awk wc
0

If you use the HISTTIMEFORMAT environment variable (e.g. to timestamp typed commands), $(echo "1 2 $HISTTIMEFORMAT" | wc -w) gives the number of leading columns per line that hold non-command parts (the history number plus the timestamp fields).

This makes the command work whether or not HISTTIMEFORMAT is set.
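
A quick illustration, assuming the common '%F %T ' timestamp format:

export HISTTIMEFORMAT='%F %T '
# "1 2 $HISTTIMEFORMAT" now holds 4 words, so the awk above counts $4 -- the first word of each command.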

zenity --list --width 500 --height 500 --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer
2010-04-28 23:45:35
User: polaco
Functions: awk sort xargs
15

This is a very simple and lightweight way to play DI.FM stations

For a more complete version of the command with proper strings in the menu, try the following (it couldn't fit in the command field above):

zenity --list --width 500 --height 500 --title 'DI.FM' --text 'Pick a Radio' --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer

This command line parses the HTML returned from http://di.fm and displays all radio stations in a nice graphical menu. After a station is chosen, its URL is passed to mplayer so the music can start.

dependencies:

- x11 with gtk environment

- zenity: simple app for displaying gtk menus (sudo apt-get install zenity on ubuntu)

- mplayer: simple audio player (sudo apt-get install mplayer on ubuntu)

ps aux |awk '{$1} {++P[$1]} END {for(a in P) if (a !="USER") print a,P[a]}'
2010-04-28 15:25:18
User: benyounes
Functions: awk ps
5

Enumerates the number of processes for each user.

ps BSD format is used here; for the standard Unix format use: ps -eLf |awk '{$1} {++P[$1]} END {for(a in P) if (a !="UID") print a,P[a]}'

find -name `egrep -s '.' * | awk -F":" '{print $1}' | sort -u` -exec stat {} \;
2010-04-26 20:01:44
Functions: awk find sort stat
1

This will run stat on each file in the directory.
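
If the goal is simply to run stat on every non-empty regular file directly under the current directory, a simpler hedged sketch (GNU find assumed) is:

find . -maxdepth 1 -type f ! -empty -exec stat {} \;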

netstat -n | awk '/^tcp/ {++B[$NF]} END {for(a in B) print a, B[a]}'
find ./ -name '*.h' -exec egrep -cH "// | /\*" {} \; | awk -F':' '{print $2 ":" $1}' | sort -gr
2010-04-23 19:00:07
User: blocky
Functions: awk egrep find sort
1

This shows you which files are most in need of commenting (one line of output per file)

/sbin/ifconfig|grep -B 1 inet |head -1 | awk '{print $5}'
awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' | sed '/^$/d' > file.html
2010-04-20 13:27:47
User: tamouse
Functions: awk sed
Tags: Linux awk html
1

The previous version leaves lots of blank lines.

awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html
2010-04-20 10:54:03
User: sata
Functions: awk
Tags: Linux awk html
4

Case insensitive! And it works even if the "<title>...</title>" spans multiple lines.

Simple! :-)
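
For example, to pull the title straight from a live page (example.com is just a stand-in URL), the same awk can take its input from curl:

curl -s http://example.com | awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' | sed '/^$/d'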

find ~/.mozilla/firefox/*/Cache -exec file {} \; | awk -F ': ' 'tolower($2)~/mpeg/{print $1}'
2010-04-19 06:59:55
User: sata
Functions: awk file find
2

Grab a list of MP3s (with full path) out of Firefox's cache

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the *full path* of those tunes straight out of FF's cache in a clean list.

A shorter and more intuitive version of the command submitted by TuxOtaku.
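
One hedged way to start listening straight away (GNU xargs assumed for -d and -r):

find ~/.mozilla/firefox/*/Cache -exec file {} \; | awk -F ': ' 'tolower($2)~/mpeg/{print $1}' | xargs -d '\n' -r mplayer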

for i in emerg alert crit error warn ; do awk '$6 ~ /^\['$i'/ {print substr($0, index($0,$6)) }' error_log | sort | uniq -c | sort -n | tail -1; done
2010-04-15 21:47:18
User: zlemini
Functions: awk sort tail uniq
4

This searches the Apache error_log for each of the five most significant Apache error levels. If any are found, the date is cut from the output so the lines can be sorted, and the most common occurrence of each error is printed.

ifconfig -a| awk '/^wlan|^eth|^lo/ {;a=$1;FS=":"; nextline=NR+1; next}{ if (NR==nextline) { split($2,b," ")}{ if ($2 ~ /[0-9]\./) {print a,b[1]}; FS=" "}}'
2010-04-15 04:34:28
User: alf
Functions: awk ifconfig
Tags: ifconfig awk IP
5

Interfaces like lo can be omitted from the pattern at the beginning; there are probably better ways of doing this, I'm a noob at awk.

[ -n "$SSH_CLIENT" ] && export DISPLAY=$(echo $SSH_CLIENT | awk '{ print $1 }'):0.0
2010-04-14 08:19:37
User: GouNiNi
Functions: awk echo export
0

In some cases you need to use a remote GUI on servers or simple machines, and it's annoying to see "cannot open display on ..." if you forgot to export your display. Just add this line to .bashrc on the remote machine. Don't forget to allow the remote client on your local X server:

xhost +
tail /var/log/emerge.log | awk -F: '{print strftime("%Y%m%d %X %Z", $1),$2}'
for i in `ls ~/.mozilla/firefox/*/Cache`; do file $i | grep -i mpeg | awk '{print $1}' | sed s/.$//; done
2010-04-11 23:14:18
User: TuxOtaku
Functions: awk file grep sed
4

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)

awk '{a[$1] += $2} END { for (i in a) {print i " " a[i]}}' /path/to/file
2010-04-11 02:28:09
User: italo
Functions: awk
1

Example:

cat <<EOF >/tmp/foo
5 45
80 35
4 10
80 15
80 10
4 10
5 20
EOF

awk '{a[$1] += $2} END { for (i in a) {print i "=" a[i]}}' /tmp/foo
80=60
4=20
5=65

ps aux|grep -i [p]rocessname|awk '{ print $2 }'|xargs kill
sudo awk '($9 ~ /404/)' /var/log/httpd/www.domain-access_log | awk '{print $2,$9,$7,$11}' | sort | uniq -c
2010-04-09 10:31:50
User: ninjasys
Functions: awk sort sudo uniq
Tags: log error apache
1

This command will return a full list of Error 404 pages from the given access log. The following fields are passed to awk:

Hostname ($2), Error Code ($9), Missing Item ($7), Referrer ($11)

You can then send this into a file (>> /path/to/file), which you can open with OpenOffice as a CSV
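
A hedged sketch of that idea, writing comma-separated values so OpenOffice picks up the columns directly (the log path and field layout are assumed to match the command above):

sudo awk '$9 ~ /404/ {print $2","$9","$7","$11}' /var/log/httpd/www.domain-access_log | sort | uniq -c | sed 's/^ *\([0-9]*\) /\1,/' >> /path/to/file.csv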

awk '$9 == 404 {print $7}' access_log | sort | uniq -c | sort -rn | head
2010-04-08 21:40:53
User: zlemini
Functions: awk sort uniq
8

Finds the top ten pages returning an http response code of 404 in an apache log.

iconv -f UTF16LE -t UTF-8 < SOURCE | awk 'BEGIN { RS="\r\n";} { gsub("\n", "\r"); print;}' > TARGET
2010-04-04 07:16:57
User: zhangweiwu
Functions: awk iconv
0

Sometimes you want to work on data sheets using heirloom Unix commands like cut, paste, sed, sort, wc and good old awk. But your user works in a Microsoft Excel spreadsheet. The idea:

1) ask your user to save it as "Unicode Text" from Microsoft Excel and send the document to you;

2) use the given command to convert it to UTF-8 text. We carefully convert "\r\n" to the local end-of-line character, and convert "\n" (which in Excel means a line break within a table cell) to "\r", which is a carriage return but not an end-of-line character in Unix. If the "\n" were not replaced with "\r", wc -l, for example, would report an incorrect line count.
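
Once converted, the usual tools apply. A small hedged example, assuming Excel's "Unicode Text" export is tab-separated and TARGET is the converted file:

cut -f2 TARGET | sort | uniq -c | sort -rn  # count the distinct values in the second column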