
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged awk - 293 results
find . -maxdepth 1 -type f -not -iname '*.jpg' -ls |awk '{TOTAL+=$7} END {print int(TOTAL/(1024^2))"MB"}'
2011-04-26 18:18:37
User: mack
Functions: awk find
Tags: awk find filesize
1

With this command we can estimate the total storage used by all files not named *.jpg in the current directory.

The syntax above relies on GNU find's -maxdepth; for other Unix systems use:

find ./* -prune ! -name '*.jpg' -ls |awk '{TOTAL+=$7} END {print int(TOTAL/(1024^2))"MB"}'

Change the jpg extension to whatever extension you need.
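The awk summing idiom is easy to check on synthetic input; a minimal sketch where plain byte counts stand in for the size column that find -ls produces:

```shell
# Sum a column of byte counts and report the total in whole MB,
# exactly as the END block of the command above does (1 MB = 1048576 bytes).
printf '1048576\n2097152\n' | awk '{TOTAL+=$1} END {print int(TOTAL/(1024^2))"MB"}'
# prints: 3MB
```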

sudo aptitude remove -P $(dpkg -l|awk '/^ii linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic\nlinux-image-%s-generic\n",$0,$0,$0)}')
2011-04-25 05:19:57
User: Bonster
Functions: awk sed sudo
-1

Same as 7272, but that one was too dangerous, so I added -P to prompt the user to continue or cancel.

Note the double space: "...^ii␣␣linux-image-2..."

Like 5813, but fixes two bugs: [1]This leaves the meta-packages 'linux-headers-generic' and 'linux-image-generic' alone so that automatic upgrades work correctly in the future. [2]Kernels newer than the currently running one are left alone (this can happen if you didn't reboot after installing a new kernel).

VAR="foo" ; awk '{ print '"$VAR"' }'
2011-04-15 07:56:20
User: FRUiT
Functions: awk
Tags: bash awk print
-2

BASH: Print shell variable into AWK

MyVAR=86; awk -v n=$MyVAR '{print n}'
MyVAR=85 awk '{ print ENVIRON["MyVAR"] }'
2011-04-14 16:46:23
User: depesz
Functions: awk
Tags: bash awk print
3

Alternatively:

export MyVAR=84; awk '{ print ENVIRON["MyVAR"] }'

MyVAR=84; awk '{ print "'"$MyVAR"'" }'
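The variants above boil down to three ways of getting a shell variable into awk; a quick sketch that exercises each on a single dummy input line (assumes a POSIX shell and awk):

```shell
MyVAR=86
echo x | awk -v n="$MyVAR" '{print n}'      # -v assignment
export MyVAR
echo x | awk '{print ENVIRON["MyVAR"]}'     # via the environment
echo x | awk '{print "'"$MyVAR"'"}'         # quote splicing into the program text
# each line prints: 86
```

Quote splicing is the most fragile of the three: the variable's value becomes part of the awk program, so quotes or backslashes in it will break the script. Prefer -v or ENVIRON.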
svn diff -r 1792:HEAD --summarize | awk '{if ($1 != "D") print $2}'| xargs -I {} tar rf incremental_release.tar {}
2011-04-05 15:00:49
User: windfold
Functions: awk diff tar xargs
Tags: bash svn awk xargs tar
0

The result of this command is a tar with all files that have been modified/added since revision 1792 until HEAD. This command is super useful for incremental releases.

ps -C apache o pid= | sed 's/^/-p /' | xargs strace
ps auxw | grep sbin/apache | awk '{print"-p " $2}' | xargs strace
2011-03-14 21:45:22
User: px
Functions: awk grep ps xargs
5

This one-liner uses strace to attach to all currently running apache processes; their PIDs are extracted from the output of the initial "ps auxw" command with some awk.

ls -1 | awk 'BEGIN{srand()} {x[NR] = $0} END{print "Selected", x[1 + int(rand() * NR)]}'
2011-03-13 20:05:06
User: saibbot
Functions: awk ls
Tags: awk random
-3

I use this command to select a random movie from my movie collection.

ls -alt /directory/ | awk '{ print $6 " " $7 " -- " $9 }'
lsof | awk '/*:https?/{print $2}' | sort -u
2011-02-04 01:37:17
User: sugitaro
Functions: awk sort
Tags: sort awk lsof
-1

% lsof -v

lsof version information:

revision: 4.78

awk ' { printf ("%s ", $0)} END {printf ("\n") } ' FILE
2011-02-02 11:51:41
User: bouktin
Functions: awk printf
Tags: awk
-1

Removes all line breaks from a given file (or from stdin, if used with | ) and replaces them with a space (or whatever character follows %s).
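A quick demonstration on three synthetic lines (note the trailing space the "%s " format leaves before the final newline):

```shell
# Join every input line with a space, then emit one final newline.
printf 'a\nb\nc\n' | awk ' { printf ("%s ", $0)} END {printf ("\n") } '
# prints: a b c  (with a trailing space)
```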

awk '{printf("/* %02d */ %s\n", NR,$0)}' inputfile > outputfile
2011-01-04 19:13:55
User: lucasrangit
Functions: awk
1

I often find the need to number enumerations and other lists when programming. With this command, create a new file called 'inputfile' with the text you want to number. Paste the contents of 'outputfile' back into your source file and fix the tabbing if necessary. You can also change this to output hex numbering by changing the "%02d" to "%02x". If you need to start at 0 replace "NR" with "NR-1". I adapted this from http://osxdaily.com/2010/05/20/easily-add-line-numbers-to-a-text-file/.
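A sketch with a two-line enumeration piped in directly instead of going through 'inputfile':

```shell
# Prefix each line with a zero-padded line number in a C comment.
printf 'RED\nGREEN\n' | awk '{printf("/* %02d */ %s\n", NR,$0)}'
# /* 01 */ RED
# /* 02 */ GREEN
```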

{ if (/^[A-Za-z0-9]/) { interface=$1; next } else { if (/inet [Aa][d]*r/) { split($2,ip,":") } else { next } } print interface"\t: "ip[2] }
awk 'BEGIN{RS="\0"}{gsub(/\n/,"<SOMETEXT>");print}' file.txt
2010-12-12 21:43:22
User: __
Functions: awk
1

awk version of 7210. Slightly longer, but expanding it to catch blank lines is easier:

awk 'BEGIN{RS="\0"}{gsub(/\n+/,"<SOMETEXT>");print}' file.txt
aptitude remove $(dpkg -l|awk '/^ii linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic\nlinux-image-%s-generic\n",$0,$0,$0)}')
2010-12-11 11:38:15
User: __
Functions: awk sed
7

Note the double space: "...^ii␣␣linux-image-2..."

Like 5813, but fixes two bugs: [1]This leaves the meta-packages 'linux-headers-generic' and 'linux-image-generic' alone so that automatic upgrades work correctly in the future. [2]Kernels newer than the currently running one are left alone (this can happen if you didn't reboot after installing a new kernel).

I'm bummed that this took 228 characters. I'd like to see a simpler version.

gawk '{n=$1;a=0;b=1;c=1;for(i=1;i<n;i++){c=a+b;a=b;b=c};print c}' << eof
2010-11-26 08:36:30
Functions: gawk
Tags: awk
-5

Takes only the first field of each input row and computes the Fibonacci number for it.
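The iterative loop can be checked by feeding a single number on stdin (plain awk works here too; nothing gawk-specific is used):

```shell
# Compute the n-th Fibonacci number iteratively for n read from stdin.
echo 10 | awk '{n=$1;a=0;b=1;c=1;for(i=1;i<n;i++){c=a+b;a=b;b=c};print c}'
# prints: 55 (the 10th Fibonacci number, counting 1, 1, 2, 3, ...)
```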

tail -f file | awk '{now=strftime("%F %T%z\t");sub(/^/, now);print}'
awk 'func f(n){return(n<2?n:f(n-1)+f(n-2))}BEGIN{while(a<24){print f(a++)}}'
awk -F, '{print $1" "$2" "$NF}' foo.txt
awk 'BEGIN {FS=","} { print $1 " " $2 " " $NF}' foo.txt
2010-11-12 15:26:04
User: EBAH
Functions: awk
-1

Sets the field separator character from the command line.

Prints the first, second and last columns.
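Both spellings behave identically; a one-line CSV piped in (instead of the foo.txt from the listing) makes that easy to see:

```shell
# -F, on the command line and FS="," in a BEGIN block are equivalent.
echo 'a,b,c,d' | awk -F, '{print $1" "$2" "$NF}'
echo 'a,b,c,d' | awk 'BEGIN {FS=","} { print $1 " " $2 " " $NF}'
# both print: a b d
```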

lynx -dump http://www.domain.com | awk '/http/{print $2}' | egrep "^https{0,1}"
google docs list |awk 'BEGIN { FS = "," }; {print "\""$1"\""}'|sed s/^/google\ docs\ get\ /|awk ' {print $0,"."}'
2010-10-26 21:00:30
Functions: awk sed
Tags: sed awk googlecl
1

Create commands to download all of your Google docs to the current directory.
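googlecl itself isn't needed to see what the quoting pipeline produces; a made-up "title,date" line stands in here for a row of 'google docs list' output:

```shell
# Wrap the first CSV field in quotes, prepend the get command, append " .".
echo 'My Document,2010-10-26' | awk 'BEGIN { FS = "," }; {print "\""$1"\""}' | sed 's/^/google docs get /' | awk ' {print $0,"."}'
# prints: google docs get "My Document" .
```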

google picasa list-albums |awk 'BEGIN { FS = "," }; {print "\""$1"\""}'|sed s/^/google\ picasa\ get\ /|awk ' {print $0,"."}'
2010-10-26 08:35:41
Functions: awk sed
Tags: sed awk googlecl
2

Create commands to download all of your Picasaweb albums.

Install Googlecl (http://code.google.com/p/googlecl/) and authenticate first.