
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using awk - 1,118 results
top -b -n 1 | awk '{if (NR <=7) print; else if ($8 == "D") {print; count++} } END {print "Total status D: "count}'
lsof | awk '{print $1}' | sort | uniq -c | sort -rn | head
ps aux | grep -i firefox | grep -v grep | awk '{print $2}' | xargs -t -i kill -9 {}
2009-02-19 18:50:00
User: blackdude
Functions: awk grep kill ps xargs
-7

This is a nice way to kill processes. The example here is for Firefox; substitute firefox with whatever the process name is.

lsof -p $(netstat -ltpn|awk '$4 ~ /:80$/ {print substr($7,1,index($7,"/")-1)}')| awk '$9 ~ /access.log$/ {print $9| "sort -u"}'
2009-02-19 16:11:54
User: rjamestaylor
Functions: awk netstat
2

Ever logged into a *nix box and needed to know which webserver is running and where all the current access_log files are? Run this one-liner to find out. It works for Apache or lighttpd as long as the CustomLog name is somewhat standard. HINT: it works great as input to a for loop, like this:

for i in `lsof -p $(netstat -ltpn|awk '$4 ~ /:80$/ {print substr($7,1,index($7,"/")-1)}')| awk '$9 ~ /access.log$/ {print $9| "sort -u"}'` ; do echo $i; done

Very useful for triage on unfamiliar servers!

svn st | grep ^\? | awk '{print $2}' | xargs svn add
netstat -alpn | grep :80 | awk '{print $4}' |awk -F: '{print $(NF-1)}' |sort | uniq -c | sort -n
du -sk * | awk '{print $1} END {print "[+z1<y]sy\nlyx\np"}' | dc
last | grep -v "^$" | awk '{ print $1 }' | sort -nr | uniq -c
2009-02-18 16:38:59
User: hkyeakley
Functions: awk grep last sort uniq
15

This command takes the output of the 'last' command, removes empty lines, extracts just the first field (the username), sorts the usernames in reverse order and then gives a summary count of the unique matches.
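
The same per-user tally can be done entirely inside awk, which drops the separate grep, sort and uniq -c passes. A minimal sketch, using canned sample lines in place of real `last` output:

```shell
# Tally logins per user; NF skips empty lines, the array does the counting.
# The printf lines stand in for live `last` output.
printf '%s\n' \
  'alice  pts/0  192.168.1.5  Mon Feb 16 10:00' \
  'bob    pts/1  192.168.1.6  Mon Feb 16 11:00' \
  'alice  pts/2  192.168.1.5  Mon Feb 16 12:00' \
  '' |
awk 'NF { count[$1]++ } END { for (u in count) print count[u], u }' | sort -rn
```

On a live system, replace the printf sample with `last`.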

printf %d 0x`dd if=/dev/urandom bs=1 count=4 2>/dev/null | od -x | awk 'NR==1 {print $2$3}'`
2009-02-18 16:23:09
User: introp
Functions: awk od printf
1

Sometimes, in a shell script, you need a random number bigger than the range of $RANDOM. This will print a random number made of four hex values extracted from /dev/urandom.
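
If you only need the decimal value, od can print it directly and skip the hex/printf round-trip. A sketch using standard od flags:

```shell
# Print one random unsigned 32-bit integer:
# -An drops the offset column, -N4 reads four bytes, -tu4 formats as unsigned decimal
od -An -N4 -tu4 /dev/urandom | tr -d ' '
```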

ps -ef | grep $USERNAME | awk {'print $2'} | xargs kill [-9]
2009-02-17 20:35:35
User: TheNomad
Functions: awk grep kill ps xargs
3

This is a 'killall' command equivalent for systems where killall is not available.

Before executing it, set the environment variable USERNAME to the user whose processes you want to kill, or replace $USERNAME in the command above with the actual username.

Side effect: if processes belonging to other users are running with $USERNAME as a command-line argument, they will be killed as well (assuming you are running this as root).

The [-9] in square brackets at the end of the command is optional and should be your last resort. I do not like to use it, as a process killed with -9 leaves a lot of mess behind.
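
Where pkill is available, it replaces the whole ps/grep/awk/xargs pipeline and cannot accidentally match grep's own command line. A sketch; the disposable `sleep 9876` process is only a stand-in target:

```shell
# pkill -u "$USERNAME" kills by process owner (the intent of the command above);
# pkill -f PATTERN matches against the full command line instead.
sleep 9876 &                 # start a disposable target process
sleep 1                      # give the child a moment to exec
pkill -f 'sleep 987[6]'      # the [6] keeps the pattern from matching itself
```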

find . -size +10240k -exec ls -l {} \; | awk '{ print $5,"",$9 }'|sort -rn > message.out
2009-02-17 19:39:56
User: rommelsharma
Functions: awk find ls
5

This command specifies the size in kilobytes using 'k' in the -size +(N)k option; the plus sign means greater than. -exec [cmd] {} \; invokes ls -l on each file, and awk extracts the 5th (size) and 9th (filename) columns from the ls -l output. The sort is numeric and in reverse (descending) order, via sort -rn.

A cron job could run a script like this and alert users when a directory has files exceeding a certain size, providing the file details as well.
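
With GNU find, the ls -l parsing can be skipped entirely: -printf emits the size and path directly. A sketch assuming GNU findutils:

```shell
# List files larger than 10 MB, biggest first; %s is the size in bytes, %p the path.
# Append "> message.out" to capture the report as in the command above.
find . -type f -size +10240k -printf '%s %p\n' | sort -rn
```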

history | awk '{print $2}' | sort | uniq -c | sort -rn | head
awk -F'^"|", "|"$' '{ print $2,$3,$4 }' file.csv
2009-02-16 21:32:46
User: SiegeX
Functions: awk
7

The $2, $3, $4 fields are arbitrary but note that the first field starts from $2 and the last field is $NF-1. This is due to the fact that the leading and trailing quotes are treated as field delimiters.
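
Feeding the separator regex a one-line sample makes the field numbering easy to see. A sketch using the same -F pattern as above:

```shell
# The leading and trailing quotes each consume one delimiter, so $1 and $NF
# are empty and the real data sits in $2..$(NF-1)
echo '"one", "two", "three"' | awk -F'^"|", "|"$' '{ print $2, $3, $4 }'
```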

newhostname=$(hostname | awk -F. '{print $1 "." $2}'); ipaddress=$(nslookup `hostname` | grep -i address | awk -F" " '{print $2}' | awk -F. '{print $3 "." $4}' | grep -v 64.142);PS1="[`id -un`.$newhostname.$ipaddress]"' (${PWD}): '; export PS1
2009-02-16 20:11:53
User: simardd
-4

Changes the PS1 to something more useful than the default:

[username.hostname.last-2-digits-of-ip] (current directory)

find path/to/folder/ -type f -print0 | xargs -0 -n 1 md5sum | awk '{print $1}' | sort | md5sum | awk '{print $1}'
2009-02-16 19:39:37
User: mcover
Functions: awk find md5sum sort xargs
-2

For quick validation of a folder's file contents (directory structure is not taken into account). I use it mostly to check whether two folders' contents are the same.
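
Wrapped in a small function, the pipeline compares two folders in a single test. A sketch; `dirsum` is just a hypothetical helper name:

```shell
# Fingerprint a folder by hashing the sorted list of its files' md5 checksums
dirsum() {
  find "$1" -type f -print0 | xargs -0 -n 1 md5sum | awk '{print $1}' | sort | md5sum | awk '{print $1}'
}

# Usage:
# [ "$(dirsum dirA)" = "$(dirsum dirB)" ] && echo same || echo different
```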

df / | awk '{print $1}' | grep dev | xargs tune2fs -l | grep create
2009-02-16 18:45:03
User: Kaio
Functions: awk df grep tune2fs xargs
9

A very useful set of commands for finding out when your file system was created.

ip route | grep default | awk '{print $3}'
2009-02-16 16:29:03
User: ruedu
Functions: awk grep route
1

This gets you your default gateway programmatically; useful for scripts.
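
The grep can be folded into the awk pattern, and an early exit guards against hosts with more than one default route. A sketch, run here against canned sample lines since routing tables differ per host:

```shell
# Parse the gateway out of `ip route`-style output; stop at the first match
printf '%s\n' \
  'default via 192.168.1.1 dev eth0' \
  '192.168.1.0/24 dev eth0 proto kernel' |
awk '/^default/ { print $3; exit }'

# On a live system:  ip route | awk '/^default/ { print $3; exit }'
```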

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n | tail
2009-02-16 15:48:27
User: TuxOtaku
Functions: awk cut netstat sort uniq
2

This command does a tally of concurrent active connections from single IPs and prints out those IPs that have the most active concurrent connections. VERY useful in determining the source of a DoS or DDoS attack.
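
The pipeline is easy to try on canned netstat-style lines; $5 is the foreign-address column. A sketch:

```shell
# Count connections per remote IP and show the busiest sources last
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 203.0.113.9:51234 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 203.0.113.9:51235 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 198.51.100.7:40000 ESTABLISHED' |
awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n | tail
```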

LC_ALL=C svn info | grep Revision | awk '{print $2}'
2009-02-16 14:53:52
Functions: awk grep info
-2

This is the simple revision number on stdout, that can be fed to any useful/fun script of yours. Setting LC_ALL is useful if you use another locale, in which case "Revision" is translated and cannot be found. I use this with doxygen to insert my source files revisions into the doc. An example in Doxyfile:

FILE_VERSION_FILTER = "function svn_filter { LC_ALL=C svn info $1 | grep Revision | awk '{print $2}'; }; svn_filter"

Share your ideas about what to do with the revision number!

find / -type f -size +25M -exec ls -lh {} \; | awk '{ print $5 " " $6$7 ": " $9 }'
2009-02-16 12:27:48
User: darkon1365
Functions: awk find ls
1

Very useful for finding all files over a specified size, such as out of control log files chewing up all available disk space. Fedora Core x specific version.

ps aux | awk '{sum+=$6} END {print sum/1024}'
for i in $(svn st | grep "?" | awk '{print $2}'); do svn add $i; done;
netstat -anl | grep :80 | awk '{print $5}' | cut -d ":" -f 1 | uniq -c | sort -n | grep -c IPHERE
2009-02-16 08:54:08
User: nullrouter
Functions: awk cut grep netstat sort uniq
3

This will tell you who has the most Apache connections by IP (replace IPHERE with the actual IP you wish to check). Or if you wish, remove | grep -c IPHERE for the full list.

netstat -pant 2> /dev/null | grep SYN_ | awk '{print $5;}' | cut -d: -f1 | sort | uniq -c | sort -n | tail -20
2009-02-16 08:49:38
3

Lists the top 20 IPs from which TCP connections are in the SYN_RECV state.

Useful on web servers to detect a SYN flood attack.

Replace SYN_ with ESTA to find established connections.

xprop | awk '/PID/ {print $3}' | xargs ps h -o pid,cmd
2009-02-16 07:55:19
User: jackhab
Functions: awk ps xargs
9

This command is useful when you want to know which process is responsible for a certain GUI application and what command you need to issue to launch it in a terminal.