
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using awk - 1,118 results
ls -l|awk '{print $6,$8}'|sort -d
2009-03-13 19:00:18
User: archlich
Functions: awk ls sort
-4

You can pipe to tail, or change the awk fields to show file size, group, owner, etc.
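For example, to list names with sizes and sort by size instead (field numbers assume a typical GNU ls -l layout and may shift with locale or options):

ls -l | awk '{print $5, $9}' | sort -n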

lsof|grep /somemount/| awk '{print $2}'|xargs kill
2009-03-12 18:42:19
User: archlich
Functions: awk grep xargs
4

This command will kill all processes using a directory. It's quick and dirty. You may also use -9 with kill in case a regular kill doesn't work. This is useful if you need to unmount a directory.
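If some processes refuse to die, the same pipeline can send SIGKILL, and where fuser is available it can do the whole job in one step (the mount point is illustrative):

lsof | grep /somemount/ | awk '{print $2}' | xargs kill -9

fuser -km /somemount/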

svn status | grep "^\?" | awk '{print $2}' | xargs svn add
2009-03-12 15:06:12
User: unixfu73000
Functions: awk grep xargs
Tags: svn
-1

This adds all new files to SVN recursively. It doesn't work for files that have spaces in their name, but why would you create a file with a space in its name in the first place?
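If you do have to cope with spaces, a sketch of a safer variant reads each path as a whole line instead of relying on awk's field splitting (it assumes svn prints the status character followed by whitespace, then the path):

svn status | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done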

svn status | grep ^? | awk '{print $2}' | xargs rm -rf
2009-03-10 17:01:40
User: Highwayman
Functions: awk grep rm xargs
1

Removes all unversioned files and folders from an svn repository. Also:

svn status --no-ignore | grep ^I | awk '{print $2}' | xargs rm -rf

will remove those files which svn status ignores. Handy to add to a script which is in your path so you can run it from any repository (a la 'svn_clean.sh').
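A minimal svn_clean.sh along those lines might look like this (the same caveat about spaces in filenames applies):

#!/bin/sh
# remove unversioned (?) and ignored (I) files and directories
svn status --no-ignore | grep '^[?I]' | awk '{print $2}' | xargs rm -rf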

grep Mar/2009 /var/log/apache2/access.log | awk '{ print $1 }' | sort -n | uniq -c | sort -rn | head
echo -n $mypass | md5sum | awk '{print $1}'
2009-03-10 13:12:21
User: tororebelde
Functions: awk echo md5sum
1

This was useful for generating passwords for some webpage users, using the sample code, inside a bash script.
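For example, with an illustrative value:

mypass=secret; echo -n $mypass | md5sum | awk '{print $1}'

which prints 5ebe2294ecd0e0f08eab7690d2a6ee69.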

awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f } NF > nf { nf = NF } END { for (f = 1; f <= nf; f++) for (r = 1; r <= NR; r++) printf a[r, f] (r==NR ? RS : FS) }'
2009-03-10 05:35:22
User: MSBF
Functions: awk printf
0

Transposes rows and columns, working the same as R's t() function.
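For example:

printf '1 2\n3 4\n' | awk '{ for (f = 1; f <= NF; f++) a[NR, f] = $f } NF > nf { nf = NF } END { for (f = 1; f <= nf; f++) for (r = 1; r <= NR; r++) printf a[r, f] (r==NR ? RS : FS) }'

prints the transpose:

1 3

2 4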

sudo cat /proc/kcore | strings | awk 'length > 20' | less
2009-03-09 02:19:47
User: nesquick
Functions: awk cat strings sudo
Tags: cat ram strings
15

This command lets you see and scroll through all of the strings that are stored in the RAM at any given time. Press space bar to scroll through to see more pages (or use the arrow keys etc).

Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here!

The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable").

If you want to dump the whole thing to a file replace the final '| less' with '> memorydump'. This is great for searching through many times (and with the added bonus that it doesn't overwrite any memory...).

Here's a neat example that turns up conversations held in Pidgin (it will probably work even after Pidgin has been closed)...

sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'

(depending on sudo settings it might be best to run

sudo su

first to get to a # prompt)

cat $(ls -tr | tail -1) | awk '{ a[$1] += 1; } END { for(i in a) printf("%d, %s\n", a[i], i ); }' | sort -n | tail -25
2009-03-06 17:50:29
User: oremj
Functions: awk cat ls sort tail
7

This command is much quicker than the alternative of "sort | uniq -c | sort -n".
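For comparison, the slower idiom it replaces would be something like:

cat $(ls -tr | tail -1) | awk '{print $1}' | sort | uniq -c | sort -n | tail -25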

touch /tmp/$$;for N in `seq -w 0 7777|grep -v [89]`; do chmod $N /tmp/$$; P=`ls -l /tmp/$$ | awk '{print $1}'`; echo $N $P; done;rm /tmp/$$
grep 'HOME.*' data.txt | awk '{print $2}' | awk -F'/' '{print $NF}'
2009-03-05 07:28:26
User: rommelsharma
Functions: awk grep
-3

grep 'HOME.*' data.txt | awk '{print $2}' | awk -F'/' '{print $NF}'

(The original last stage was awk '{FS="/"}{print $NF}', but FS assigned in the main block only takes effect from the second record onwards, so the first filename came out unsplit; setting the separator with -F'/' or in a BEGIN block avoids that.)

OR

awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'

In this example, we have a text file containing several entries like:

---

c1 c2 c3 c4

this is some data

HOME /dir1/dir2/.../dirN/somefile1.xml

HOME /dir1/dir2/somefile2.xml

some more data

---

For lines starting with HOME, we extract the second field, which is a file path including the file name; from that we need only the filename, ignoring the slash-delimited path.

The output would be:

somefile1.xml

somefile2.xml

(If you vote this down, please give your reasons as well and enlighten the souls :-) )
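An alternative sketch that sidesteps the second awk entirely, at the cost of one basename process per line:

awk '/HOME/ {print $2}' data.txt | xargs -n1 basename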

zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
2009-03-03 13:45:56
User: dbart
Functions: awk sort uniq zgrep
8

This command counts how many times someone has tried to log in to your server and failed. If a user shows many failures, that account is being targeted on your system, and you might want to make sure the user either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all login attempts for users that don't exist on your system.

gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
awk -F "=| "
2009-03-02 21:09:51
User: Bender
Functions: awk cat file
9

You can use multiple field separators by separating them with | (logical OR); the argument to -F is treated as a regular expression.

This may be helpful when you want to split a string by two separators for example.

echo "one=two three" | awk -F '=| ' '{print $1, $3}'

one three

ps axww | grep SomeCommand | awk '{ print $1 }' | xargs kill
2009-02-28 17:48:51
User: philiph
Functions: awk grep ps xargs
-7

This command kills all processes with 'SomeCommand' in the process name. There are other more elegant ways to extract the process names from ps but they are hard to remember and not portable across platforms. Use this command with caution as you could accidentally kill other matching processes!

xargs is particularly handy in this case because it makes it easy to feed the process IDs to kill and it also ensures that you don't try to feed too many PIDs to kill at once and overflow the command-line buffer.

Note that if you are attempting to kill many thousands of runaway processes at once you should use 'kill -9'. Otherwise the system will try to bring each process into memory before killing it and you could run out of memory. Typically when you want to kill many processes at once it is because you are already in a low-memory situation, so if you don't 'kill -9' you will make things worse.
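Where pgrep/pkill are installed, they express the same thing more concisely and avoid the pipeline's grep matching itself (-f matches against the full command line rather than just the process name):

pkill SomeCommand

pkill -9 -f SomeCommand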

ifconfig | awk '/inet / {print $2}'
2009-02-27 17:05:08
User: haivu
Functions: awk ifconfig
1

On the Mac, the format ifconfig puts out is a little different from Linux: the IP address is separated by a space instead of a colon, which makes parsing it easier. See the related command for Linux/Unix:

http://www.commandlinefu.com/commands/view/651/getting-the-ip-address-of-eth0

grep "FOUND" /var/log/squidclamav.log | awk '{print $5"-"$2"-"$3","$4","$11}' | sed -e 's/\,http.*url=/\,/g' | sed -e 's/&/\,/g' | sed -e 's/source=//g' |sed -e 's/user=//g' | sed -e 's/virus=//g' | sed -e 's/stream\:+//g' | sed -e 's/\+FOUND//g'
2009-02-27 13:28:18
User: nablas
Functions: awk grep sed
0

This command produces a CSV list of infected files detected by ClamAV through the squidclamav redirector.

awk 'BEGIN {FS=","} {loc = $4, val=$5; getline < "f0001ch1.csv"; print loc,val,$5}' f0001ch2.csv > data
2009-02-27 13:01:16
User: Masse
Functions: awk
0

Parses Tektronix-generated CSV files for both channel 1 and channel 2 and joins them together. The result can easily be plotted with gnuplot afterwards.
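The joined file has three space-separated columns (location, channel 1 value, channel 2 value), so a quick look in gnuplot could be something like:

gnuplot -e "plot 'data' using 1:2 with lines, 'data' using 1:3 with lines; pause -1"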

IPADDR=`ifconfig eth0 | grep -i inet | awk -F: '{print $2}'| awk '{print $1}'`
2009-02-25 22:58:19
User: rockon
Functions: awk grep
0

Useful in scripts when you just need an IP address in a variable.
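For instance, inside a script (the interface and message are illustrative):

IPADDR=`ifconfig eth0 | grep -i inet | awk -F: '{print $2}'| awk '{print $1}'`
echo "eth0 is up on $IPADDR"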

du --max-depth=1 | sort -r -n | awk '{split("k m g",v); s=1; while($1>1024){$1/=1024; s++} print int($1)" "v[s]"\t"$2}'
2009-02-24 11:03:08
User: hans
Functions: awk du sort
16

I use this on Debian testing. It works like the other sorted du variants, but I like small numbers and suffixes :)
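Output lines then look something like this (sizes and directory names are illustrative):

16 g	./video
2 m	./music
512 k	./scripts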

for x in `psql -e\l | awk '{print $1}'| egrep -v "(^List|^Name|\-\-\-\-\-|^\()"`; do pg_dump -C $x | gzip > /var/lib/pgsql/backups/$x-nightly.dmp.gz; done
2009-02-21 15:21:09
User: f4nt
Functions: awk egrep gzip
1

Run as the postgres user, this dumps each database individually. It dumps with the create statements as well, so you can just 'zcat $x-nightly.dmp.gz | psql' to reimport/recreate a database from a backup.
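To restore one of the dumps later (the database name is illustrative):

zcat /var/lib/pgsql/backups/mydb-nightly.dmp.gz | psql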

N="filepath" ; P=/proc/$(lsof +L1 | grep "$N" | awk '{print $2}')/fd ; ls -l $P | sed -rn "/$N/s/.*([0-9]+) ->.*/\1/p" | xargs -I_ cat $P/_ > "$N"
2009-02-21 02:31:24
User: laburu
Functions: awk cat grep ls sed xargs
5

Note that the file at the given path will have the contents of the (still) deleted file, but it is a new file with a new inode number; in other words, this restores the data, but it does not actually "undelete" the old file.

I posted a function declaration encapsulating this functionality to http://www.reddit.com/r/programming/comments/7yx6f/how_to_undelete_any_open_deleted_file_in_linux/c07sqwe (please excuse the crap formatting).

for i in `ps aux | grep ssh | grep -v grep | awk '{print $2}'` ; do kill $i; done
ps auxwww | grep outofcontrolprocess | awk '{print $2}' | xargs kill -9
ps aux | awk '/name/ {print $2}'
2009-02-20 21:35:52
User: evil_otto
Functions: awk ps
-5

This finds a process id by name, but without the extra grep that you usually see. Remember, awk can grep too!
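For comparison, the usual grep version of the same thing:

ps aux | grep name | grep -v grep | awk '{print $2}'

Note that either variant can match its own process, since the pattern appears on its command line; the classic bracket trick avoids that:

ps aux | awk '/[n]ame/ {print $2}'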