What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Commands tagged awk
Terminal - Commands tagged awk - 303 results
awk 'BEGIN{RS="\0"}{gsub(/\n/,"<SOMETEXT>");print}' file.txt
2010-12-12 21:43:22
User: __
Functions: awk

awk version of 7210. Slightly longer, but expanding it to catch blank lines is easier:

awk 'BEGIN{RS="\0"}{gsub(/\n+/,"<SOMETEXT>");print}' file.txt
aptitude remove $(dpkg -l|awk '/^ii linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic\nlinux-image-%s-generic\n",$0,$0,$0)}')
2010-12-11 11:38:15
User: __
Functions: awk sed

Note the double space: "...^ii␣␣linux-image-2..."

Like 5813, but fixes two bugs: [1]This leaves the meta-packages 'linux-headers-generic' and 'linux-image-generic' alone so that automatic upgrades work correctly in the future. [2]Kernels newer than the currently running one are left alone (this can happen if you didn't reboot after installing a new kernel).

I'm bummed that this took 228 characters. I'd like to see a simpler version.
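
To preview what would be removed before handing anything to aptitude, you can run just the front half of the pipeline, which lists the old kernel version strings it targets (same double-space caveat applies):

dpkg -l|awk '/^ii  linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'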

gawk '{n=$1;a=0;b=1;c=1;for(i=1;i<n;i++){c=a+b;a=b;b=c};print c}' << eof
2010-11-26 08:36:30
Functions: gawk
Tags: awk

Takes the first field of each input row and computes the Fibonacci number for that value.
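
For a single value you can also feed the number on stdin instead of a heredoc (hypothetical usage; this prints the 10th Fibonacci number, 55):

echo 10 | gawk '{n=$1;a=0;b=1;c=1;for(i=1;i<n;i++){c=a+b;a=b;b=c};print c}'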

tail -f file | awk '{now=strftime("%F %T%z\t");sub(/^/, now);print}'
awk 'func f(n){return(n<2?n:f(n-1)+f(n-2))}BEGIN{while(a<24){print f(a++)}}'
awk -F, '{print $1" "$2" "$NF}' foo.txt
awk 'BEGIN {FS=","} { print $1 " " $2 " " $NF}' foo.txt
2010-11-12 15:26:04
User: EBAH
Functions: awk

Sets the field separator character from the command line.

Prints the first, second and last columns.
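
Both forms behave the same; a quick check on a sample line, which prints "a b d":

echo "a,b,c,d" | awk -F, '{print $1" "$2" "$NF}'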

lynx -dump http://www.domain.com | awk '/http/{print $2}' | egrep "^https{0,1}"
google docs list |awk 'BEGIN { FS = "," }; {print "\""$1"\""}'|sed s/^/google\ docs\ get\ /|awk ' {print $0,"."}'
2010-10-26 21:00:30
Functions: awk sed
Tags: sed awk googlecl

Create commands to download all of your Google docs to the current directory.

google picasa list-albums |awk 'BEGIN { FS = "," }; {print "\""$1"\""}'|sed s/^/google\ picasa\ get\ /|awk ' {print $0,"."}'
2010-10-26 08:35:41
Functions: awk sed
Tags: sed awk googlecl

Create commands to download all of your Picasaweb albums.

Install Googlecl (http://code.google.com/p/googlecl/) and authenticate first.

find -regextype posix-egrep -regex ".*/[A-Z]{3}_201009[0-9]{2}.*" -printf "%f %s\n" | awk '{ SUM += $2;COUNT++ } END { print SUM/1024 " kb in " COUNT " files" }'
lynx -dump http://www.domain.com | awk '/http/{print $2}'
man $(/bin/ls /bin | awk '{ cmd[i++] = $0 } END { srand(); print cmd[int(rand()*length(cmd))]; }')
2010-08-20 17:31:02
User: emilsit
Functions: awk man
Tags: man awk

Build an awk array with all commands and then select a random one at the end.

This avoids spawning extra processes for counting with wc or generating random numbers.

Explicitly call /bin/ls to avoid interactions with aliases.
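
If your awk lacks length() on arrays (a gawk extension), the running index works just as well (a sketch):

man $(/bin/ls /bin | awk '{ cmd[i++] = $0 } END { srand(); print cmd[int(rand()*i)]; }')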

dir="/bin"; man $(ls $dir |sed -n "$(echo $(( $RANDOM % $(ls $dir |wc -l | awk "{ print $1; }" ) + 1 )) )p")
2010-08-20 16:31:50
User: camocrazed
Functions: dir ls man sed
Tags: man sed awk echo wc

Broaden your knowledge of the utilities available to you in no particular order whatsoever! Then use that knowledge to create more nifty one-liners that you can post here. =p

Takes a random number modulo the number of files in $dir, prints the filename corresponding to that number, and passes it as an argument to man.
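
If GNU coreutils is available, shuf does the random pick with far less plumbing (a different approach, just a sketch):

man $(ls /bin | shuf -n 1)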

svn status | grep "^\?" | awk '{print $2}' | xargs svn add
2010-08-14 18:56:15
User: kureikain
Functions: awk grep xargs
Tags: svn awk grep

When working on a big project with SVN, you create quite a few new files, and you can't just sit there typing svn add for each one of them!

svn status returns a list of files flagged "?" (not added), "M" (modified) or "D" (deleted). This command greps for the "?" lines and adds those files to SVN.
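
A slightly more defensive variant (a sketch, assuming GNU xargs): the -r flag skips running svn add when there is nothing to add.

svn status | grep "^\?" | awk '{print $2}' | xargs -r svn add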

seq 8 | awk '{print "e(" $0 ")" }' | bc -l
2010-08-14 02:52:39
User: polar
Functions: awk bc seq
Tags: awk seq bc

If you want a sequence that can be plotted, do:

seq 8 | awk '{print "e(" $0 ")" }' | bc -l | awk '{print NR " " $0}'

Other bc functions include s (sine), c (cosine), l (log) and j (bessel). See the man page for details.
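
The same pattern works with the other bc -l functions, e.g. sine:

seq 8 | awk '{print "s(" $0 ")" }' | bc -l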

curl -sL 'www.commandlinefu.com/commands/random' | awk -F'</?[^>]+>' '/"command"/{print $2}'
2010-08-13 11:42:42
User: putnamhill
Functions: awk
Tags: awk curl random

Splitting on tags in awk is a handy way to parse HTML.
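
The field separator treats every tag as a delimiter, so $2 is the text inside the first element. For example (a sketch with made-up markup), this prints "ls -la":

echo '<div class="command">ls -la</div>' | awk -F'</?[^>]+>' '{print $2}'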

svn st | grep -e '^M' | awk '{print $2}' | xargs svn revert
ip route | awk '/default/{print $3}'
netstat -rn | awk '/UG/{print $2}'
2010-08-09 15:56:02
User: putnamhill
Functions: awk netstat

Tested on CentOS, Ubuntu, and MacOS.
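
Both commands print the default gateway; a typical use is capturing it in a variable (hypothetical usage):

gw=$(ip route | awk '/default/{print $3}'); ping -c 1 "$gw"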

lsof /dev/snd/pcm*p /dev/dsp | awk ' { print $2 }' | xargs kill
2010-07-23 20:24:16
User: alustenberg
Functions: awk xargs

For when a program is hogging the sound output: finds it and kills it. Add -9 to the end of kill for wedged processes. Add 'grep ^program' after lsof to filter by program name.
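
Putting those two suggestions together (mplayer here is just a placeholder for the offending program):

lsof /dev/snd/pcm*p /dev/dsp | grep ^mplayer | awk ' { print $2 }' | xargs kill -9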

awk '$1~/^DocumentRoot/{print $2}' /etc/apache2/sites-available/default
awk 'NR==linenumber' filename
curl -s "http://www.socrata.com/api/views/vedg-c5sb/rows.json?search=Axelrod" | grep "data\" :" | awk '{ print $17 }'
2010-07-01 23:54:54
User: mheadd
Functions: awk grep
Tags: awk grep curl

Query the Socrata Open Data API being used by the White House to find any employee's salary using curl, grep and awk.

Change the value of the search parameter (example uses Axelrod) to the name of any White House staffer to see their annual salary.

ifconfig eth0 | awk '/inet / {print $2}' | cut -d ':' -f2