What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using awk - 1,203 results
ps ax | grep <processname> | grep -v grep | awk '{print $1}' | sudo xargs kill -9
/usr/sbin/arp -i eth0 | awk '{print $3}' | sed 1d
ps aux | grep 'httpd ' | awk '{print $2}' | xargs kill -9
mysql --database=dbname -B -N -e "SHOW TABLES" | awk '{print "ALTER TABLE", $1, "CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;"}' | mysql --database=dbname &
2009-03-21 18:45:15
User: root
Functions: awk
Tags: mysql

This loops through all tables and changes their collation to UTF-8. You should back up beforehand, though, in case some data is lost in the process.

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
2009-03-21 06:41:59
Functions: awk printf sudo zcat

Show the number of failed tries of login per account. If the user does not exist it is marked with *.
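The `| "sort -n"` at the end is awk's own output pipe: print/printf output written from inside awk to an external command, closed automatically when awk exits. A minimal sketch of the same trick:

```shell
# awk can pipe its output to an external command; the pipe is
# flushed and closed when awk exits
printf 'b 2\na 1\nc 3\n' | awk '{ print $1 | "sort" }'
# a
# b
# c
```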

alias ropened='p4 opened | awk -F# "{print \$1}" | p4 -x - revert'
alias opened='p4 opened | awk -F# "{print \$1}"'
2009-03-20 11:06:41
User: Alexander
Functions: alias awk
Tags: p4 SCM Perforce

Just type 'opened' and get all files currently opened for edit.

vos listvldb | agrep LOCKED -d RWrite | grep RWrite: | awk -F: '{print $2}' | awk '{printf("%s ",$1)} END {printf("\n")}'
2009-03-17 19:55:39
User: mpb
Functions: awk grep

This command shows if there are any locked AFS volumes.

The output is a list of AFS volume IDs (or nothing if there are none locked).

cat count.txt | awk '{ sum+=$1} END {print sum}'
2009-03-16 00:22:13
User: duxklr
Functions: awk cat
Tags: awk

Takes an input file (count.txt) containing one number per line and adds/sums the first column of numbers.
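A quick way to try it, with illustrative sample numbers:

```shell
# build a small count.txt and sum its first column
printf '10\n20\n30\n' > count.txt
awk '{ sum += $1 } END { print sum }' count.txt   # prints 60
```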

ls -1 | grep " " | awk '{printf("mv \"%s\" ",$0); gsub(/ /,"_",$0); printf("%s\n",$0)}' | sh # rename filenames: spaces to "_"
2009-03-15 18:42:43
User: mpb
Functions: awk grep ls rename sh

This command renames files with embedded spaces in the current directory, replacing the spaces with the underscore ("_") character.
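If you'd rather not pipe generated commands into sh, a while-read loop does the same job and also survives quotes in filenames (a sketch with the same semantics assumed):

```shell
# rename files whose names contain spaces, replacing spaces with underscores
ls -1 | grep " " | while IFS= read -r f; do
  mv -- "$f" "$(printf '%s' "$f" | tr ' ' '_')"
done
```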

ls -l|awk '{print $6,$8}'|sort -d
2009-03-13 19:00:18
User: archlich
Functions: awk ls sort

Can pipe to tail, or change the awk fields for file size, groups, users, etc.

lsof|grep /somemount/| awk '{print $2}'|xargs kill
2009-03-12 18:42:19
User: archlich
Functions: awk grep xargs

This command will kill all processes using a directory. It's quick and dirty. One may also use a -9 with kill in case regular kill doesn't work. This is useful if one needs to umount a directory.

svn status | grep "^\?" | awk '{print $2}' | xargs svn add
2009-03-12 15:06:12
User: unixfu73000
Functions: awk grep xargs
Tags: svn

This adds all new files to SVN recursively. It doesn't work for files that have spaces in their name, but why would you create a file with a space in its name in the first place?
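For paths that do contain spaces, one workaround (a sketch, not from the original post) is to strip svn's fixed-width status prefix instead of splitting on whitespace; this assumes the 8-column status layout used by svn 1.6 and later:

```shell
# add unversioned files, tolerating spaces in their names:
# cut off the 8-column status prefix so the full path survives
svn status | grep '^?' | cut -c9- | while IFS= read -r f; do
  svn add "$f"
done
```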

svn status | grep ^? | awk '{print $2}' | xargs rm -rf
2009-03-10 17:01:40
User: Highwayman
Functions: awk grep rm xargs

Removes all unversioned files and folders from an svn repository. Also:

svn status --no-ignore | grep ^I | awk '{print $2}' | xargs rm -rf

will remove those files which svn status ignores. Handy to add to a script which is in your path so you can run it from any repository (a la 'svn_clean.sh').

grep Mar/2009 /var/log/apache2/access.log | awk '{ print $1 }' | sort -n | uniq -c | sort -rn | head
echo -n "$mypass" | md5sum | awk '{print $1}'
2009-03-10 13:12:21
User: tororebelde
Functions: awk echo md5sum

This was useful for generating passwords for some webpage users, using the sample code inside a bash script.

awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f } NF > nf { nf = NF } END { for (f = 1; f <= nf; f++) for (r = 1; r <= NR; r++) printf a[r, f] (r==NR ? RS : FS) }'
2009-03-10 05:35:22
User: MSBF
Functions: awk printf

Works the same as R's t(): it transposes the rows and columns of its input.
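For example, transposing a 2x3 whitespace-separated matrix:

```shell
printf '1 2 3\n4 5 6\n' \
  | awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f }
         NF > nf { nf = NF }
         END { for (f = 1; f <= nf; f++)
                 for (r = 1; r <= NR; r++)
                   printf a[r, f] (r == NR ? RS : FS) }'
# 1 4
# 2 5
# 3 6
```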

sudo cat /proc/kcore | strings | awk 'length > 20' | less
2009-03-09 02:19:47
User: nesquick
Functions: awk cat strings sudo
Tags: cat ram strings

This command lets you see and scroll through all of the strings that are stored in the RAM at any given time. Press space bar to scroll through to see more pages (or use the arrow keys etc).

Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here!

The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable").

If you want to dump the whole thing to a file replace the final '| less' with '> memorydump'. This is great for searching through many times (and with the added bonus that it doesn't overwrite any memory...).

Here's a neat example that digs up conversations held in Pidgin (it will probably still work after Pidgin has been closed)...

sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'

(depending on sudo settings it might be best to run

sudo su

first to get to a # prompt)

cat $(ls -tr | tail -1) | awk '{ a[$1] += 1; } END { for(i in a) printf("%d, %s\n", a[i], i ); }' | sort -n | tail -25
2009-03-06 17:50:29
User: oremj
Functions: awk cat ls sort tail

This command is much quicker than the alternative of "sort | uniq -c | sort -n".
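Both pipelines produce the same counts; the awk version just makes a single pass instead of sorting the full input first. A toy comparison:

```shell
# one-pass counting in awk, then sort only the distinct values
printf 'a\nb\na\na\nb\nc\n' \
  | awk '{ a[$1] += 1 } END { for (i in a) printf("%d, %s\n", a[i], i) }' \
  | sort -n
# 1, c
# 2, b
# 3, a
```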

touch /tmp/$$;for N in `seq -w 0 7777|grep -v [89]`; do chmod $N /tmp/$$; P=`ls -l /tmp/$$ | awk '{print $1}'`; echo $N $P; done;rm /tmp/$$
grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}'   # or, alternatively:
awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'
2009-03-05 07:28:26
User: rommelsharma
Functions: awk grep

grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}'


awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'

In this example, we have a text file with several entries like:


c1 c2 c3 c4

this is some data

HOME /dir1/dir2/.../dirN/somefile1.xml

HOME /dir1/dir2/somefile2.xml

some more data


For lines starting with HOME, we extract the second field, which is a file path ending in a filename, and from that we need only the filename, ignoring the slash-delimited path.

The output would be:

somefile1.xml

somefile2.xml
(In case you give a negative vote - please give your reasons as well and enlighten the souls :-) )
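If the paths never contain spaces, basename can do the same filename extraction without the second awk (a variant, not from the original post):

```shell
# same extraction via basename (assumes no spaces in the paths)
printf 'HOME /dir1/dir2/somefile2.xml\n' > data.txt
awk '/HOME/ {print $2}' data.txt | xargs -n1 basename   # prints somefile2.xml
```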

zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
2009-03-03 13:45:56
User: dbart
Functions: awk sort uniq zgrep

This command counts the number of times someone has tried to log in to your server and failed. If there are a lot, that user is being targeted on your system and you might want to make sure the user either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all logins from users that don't exist on your system.

gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
awk -F "=| "
2009-03-02 21:09:51
User: Bender
Functions: awk cat file

You can use multiple field separators by separating them with | (meaning OR).

This may be helpful when you want to split a string by two separators for example.

echo "one=two three" | awk -F "=| " '{print $1, $3}'

one three

ps axww | grep SomeCommand | awk '{ print $1 }' | xargs kill
2009-02-28 17:48:51
User: philiph
Functions: awk grep ps xargs

This command kills all processes with 'SomeCommand' in the process name. There are other more elegant ways to extract the process names from ps but they are hard to remember and not portable across platforms. Use this command with caution as you could accidentally kill other matching processes!

xargs is particularly handy in this case because it makes it easy to feed the process IDs to kill and it also ensures that you don't try to feed too many PIDs to kill at once and overflow the command-line buffer.
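xargs' batching can be seen directly by capping the arguments per invocation with -n; here echo runs three times:

```shell
# xargs splits the argument list into batches of two, invoking echo once per batch
printf '1 2 3 4 5 6\n' | xargs -n 2 echo
# 1 2
# 3 4
# 5 6
```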

Note that if you are attempting to kill many thousands of runaway processes at once you should use 'kill -9'. Otherwise the system will try to bring each process into memory before killing it and you could run out of memory. Typically when you want to kill many processes at once it is because you are already in a low-memory situation, so if you don't 'kill -9' you will make things worse.