What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts that carry only commands with at least 3 or 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using awk
Terminal - Commands using awk - 1,116 results
svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
2009-06-05 14:07:28
User: jemptymethod
Functions: awk sort
Tags: svn awk log
Votes: 4

Just change the date following the -r flag and/or the username in the user== conditional, and substitute yms_web with the name of your module.
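
For example, a hypothetical variant that lists files under a my_module directory touched by user alice since the start of 2011 (the module and username here are placeholders):

svn log -v -r{2011-01-01}:HEAD | awk '/^r[0-9]+ / {user=$3} /my_module/ {if (user=="alice") {print $2}}' | sort | uniq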

awk '/match/{print NR}' file
for i in `du --max-depth=1 $HOME | sort -n -r | awk '{print $1 ":" $2}'`; do size=`echo $i | awk -F: '{print $1}'`; dir=`echo $i | awk -F: '{print $NF}'`; size2=$(($size/1024)); echo "$size2 MB used by $dir"; done | head -n 10
cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
2009-06-02 15:51:51
User: lrvick
Functions: awk cat
Votes: 0

Ever compress a file for the web by replacing all newline characters with nothing so it makes one nice big blob?

It is a great idea; however, what about when you want to edit that file? ...Serious pain in the butt.

I ran into this today in that my only copy of a CSS file was "compressed" with no newlines.

I whipped this up and it converted back into nice human readable CSS :-)

It could be nicer, but it does the job.
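
As a quick illustration of what it does to a made-up one-line blob:

echo 'body{margin:0;padding:0}h1{color:red}' | awk '{gsub(/{|}|;/,"&\n"); print}'

Each brace and semicolon is reprinted with a newline after it, yielding one declaration per line.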

while true; do sleep 1 | dialog --clear --gauge "Quality: " 0 0 $(cat /proc/net/wireless | grep $WIRELESSINTERFACE | awk '{print $3}' | tr -d "."); done
2009-05-31 16:09:23
User: ncaio
Functions: awk cat grep sleep tr
Votes: 1

The variable WIRELESSINTERFACE should contain the name of your wireless interface.
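
For example, assuming the interface is called wlan0 (adjust for your system):

WIRELESSINTERFACE=wlan0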

pdfinfo Virtualization_A_Beginner_Guide.pdf | awk /Pages/
mkdir -p /cdrom/unnamed_cdrom ; mount -F hsfs -o ro `ls -al /dev/sr* |awk '{print "/dev/" $11}'` /cdrom/unnamed_cdrom
2009-05-31 08:42:20
User: felix001
Functions: awk mkdir mount
Votes: -2

This will allow you to mount a CD-ROM on Solaris SPARC 9 or lower. This will not work on Solaris 10 due to vold, the volume management daemon.

www.fir3net.com

for i in `svn status | egrep '^(M|A)' | sed -r 's/\+\s+//' | awk '{ print $2 }'` ; do if [ ! -d $i ] ; then php -l $i ; fi ; done
2009-05-29 23:59:28
Functions: awk egrep sed
Tags: svn Linux PHP
Votes: 0

Really only valuable in a PHP-only project directory. This uses the standard Linux (GNU) versions of the tools. On most older BSD variants of sed, use -E instead of -r, or use sed 's/\+[[:space:]]\{1,\}//' instead.
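
For reference, a sketch of the BSD-friendly variant of the whole loop (same logic, only the sed invocation changes):

for i in `svn status | egrep '^(M|A)' | sed -E 's/\+[[:space:]]+//' | awk '{ print $2 }'` ; do if [ ! -d $i ] ; then php -l $i ; fi ; done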

awk 'BEGIN{srand()}{print rand(),$0}' SOMEFILE | sort -n | cut -d ' ' -f2-
2009-05-29 01:20:50
User: axelabs
Functions: awk cut sort
Tags: sort awk random
Votes: 4

This appends a random number as the first field of each line in SOMEFILE, sorts by that first column, and finally cuts off the random numbers.
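
If you have GNU coreutils available, shuf performs the same shuffle in one step:

shuf SOMEFILE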

awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}' file.dat
2009-05-29 00:07:24
User: mungewell
Functions: awk
Votes: 3

Sometimes jittery data hides trends; performing a rolling average can give a clearer view.
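
To see it in action, feed it a simple sequence; with size=3 it prints 1, 1.5 and 2 while the window fills, then the 3-point moving averages 3, 4, 5 and so on:

seq 1 10 | awk 'BEGIN{size=3} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}'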

mysql -s -e "show processlist" |awk '{print $1}'
svn log fileName|cut -d" " -f 1|grep -e "^r[0-9]\{1,\}$"|awk {'sub(/^r/,"",$1);print "svn cat fileName@"$1" > /tmp/fileName.r"$1'}|sh
2009-05-27 02:11:58
User: fizz
Functions: awk cut grep
Tags: bash svn awk grep
Votes: 2

Exported files will get a .r23 extension (where 23 is the revision number).

lsof -nP +p 24073 | grep -i listen | awk '{print $1,$2,$7,$8,$9}'
kill_daemon() { echo "Daemon?"; read dm; kill -15 $(netstat -atulpe | grep $dm | cut -d '/' -f1 | awk '{print $9}'); }; alias kd='kill_daemon'
2009-05-26 20:39:56
User: P17
Votes: -5

Just find the daemon with netstat -atulpe, then type its name at the prompt and it will be sent SIGTERM.
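
Typical usage might look like this, assuming a hypothetical daemon called exampled is running:

$ kd
Daemon?
exampled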

ip route show dev eth0 | awk '{print $7}'
2009-05-26 20:29:54
User: P17
Functions: awk route
Tags: IP
Votes: 2
ip address show | grep eth0 | sed '1d' | awk '{print $2}'

Does the same, but shows the network prefix.

netstat -ntlp | grep -w 80 | awk '{print $7}' | cut -d/ -f1
function autoCompleteHostname() { local hosts; local cur; hosts=($(awk '{print $1}' ~/.ssh/known_hosts | cut -d, -f1)); cur=${COMP_WORDS[COMP_CWORD]}; COMPREPLY=($(compgen -W '${hosts[@]}' -- $cur )) } complete -F autoCompleteHostname ssh
2009-05-17 23:12:34
User: sbisordi
Functions: awk cut
Votes: 6

This is meant for the bash shell. Put this function in your .profile and you'll be able to use tab-completion when sshing to any host in your known_hosts file. This assumes that your known_hosts file is located at ~/.ssh/known_hosts. The "complete" command should go on a separate line, as follows:

function autoCompleteHostname() {

local hosts=($(awk '{print $1}' ~/.ssh/known_hosts | cut -d, -f1));

local cur=${COMP_WORDS[COMP_CWORD]};

COMPREPLY=($(compgen -W '${hosts[@]}' -- $cur ))

}

complete -F autoCompleteHostname ssh

find . -name \*.mp3 -printf "%C+ %h/%f\n" | sort -r | head -n20 | awk '{print "\""$2"\""}' | xargs -I {} cp {} ~/tmp
2009-05-17 07:06:10
User: bkinsey
Functions: awk cp find head sort xargs
Votes: 2

Change ~/tmp to the destination directory, such as your mounted media. Change -n20 to however many files you want to copy. cp will start reporting errors once the media is full. I use this to put my most recently downloaded podcasts onto my phone.
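
Note that the awk '{print $2}' stage truncates paths containing spaces. A rough sketch that avoids this by using NUL delimiters instead (assuming GNU versions of find, sort, head and sed):

find . -name '*.mp3' -printf '%C@ %p\0' | sort -zrn | head -zn20 | sed -z 's/^[^ ]* //' | xargs -0 -I{} cp {} ~/tmp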

svn st | grep '^\?' | awk '{print $2}' | xargs svn add; svn st | grep '^\!' | awk '{print $2}' | xargs svn rm
2009-05-14 14:34:50
User: stedwick
Functions: awk grep xargs
Votes: 0

Automatically add and remove files in Subversion so that you don't have to do it through the annoying svn commands anymore.

chkconfig --list | fgrep :on | sed -e 's/\(^.*\)*0:off/\1:/g' -e 's/\(.\):on/\1/g' -e 's/.:off//g' | tr -d [:blank:] | awk -F: '{print$2,$1}' | ssh host 'cat > foo'
2009-05-13 21:17:39
User: catawampus
Votes: 2

And then to complete the task:

Go to the target host:

ssh host

Turn everything off:

for i in `chkconfig --list | fgrep :on | awk '{print $1}'` ; do chkconfig --level 12345 $i off; done

Create duplicate config:

while read line; do chkconfig --level $line on; done < foo
du -sb *|sort -nr|head|awk '{print $2}'|xargs du -sh
ip route show dev ppp0 | awk '{ print $7 }'
awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -F\ '{ if($1>500) print $1,$2;}' | sort -n
2009-05-05 22:21:04
User: jaymzcd
Functions: awk grep sort uniq
Votes: 1

This prints a summary of your referers from your logs, as long as they occurred a certain number of times (in this case 500). The grep command excludes some terms; I add this in to remove results I'm not interested in.

Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
2009-05-05 21:51:16
User: jaymzcd
Functions: awk egrep wc
Votes: 0

I use this (well, I normally just drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to use CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's normally easier to have it all at the front where you just have to edit it and hit enter.)

For people new to the shell, it does the following. The Q= and F= bits just create names we can refer to. awk -F\" '{print $4}' $F reads the file specified by $F and splits each line on double quotes, printing the fourth column for egrep to work on; the fourth column in the log is the referer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
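
To make the splitting concrete, here is a made-up line in the common combined log format:

1.2.3.4 - - [05/May/2009:21:51:16 +0000] "GET / HTTP/1.1" 200 512 "http://reddit.com/r/foo" "Mozilla/5.0"

Splitting on double quotes makes $2 the request, $4 the referer (http://reddit.com/r/foo here) and $6 the user agent.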

ps -eo user,pcpu,pmem | tail -n +2 | awk '{num[$1]++; cpu[$1] += $2; mem[$1] += $3} END{printf("NPROC\tUSER\tCPU\tMEM\n"); for (user in cpu) printf("%d\t%s\t%.2f%%\t%.2f%%\n",num[user], user, cpu[user], mem[user]) }'