What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta; it's not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

» The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
» If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands tagged awk - 303 results
sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
2012-09-11 14:51:10
User: sgowie
Functions: awk lastb sort sudo
1

The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). The reference file is read/write by root only by default. This can be quite a long list, with lots of bots hammering away at your machine. Sometimes it is more important to see the scale of things, or in this case the volume of failed logins tied to each source IP.

The awk statement determines if the 3rd element is an IP address, and if so increments the running count of failed login attempts associated with it. When done it prints the IP and count.

The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, and that the -n and -k can be combined to -nk.

Please be aware that the btmp file will contain every instance of a failed login unless explicitly rolled over. It should be safe to delete/archive this file after you've processed it.
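
For a rough sense of what the same counting looks like with more familiar tools, here is an equivalent sketch that leans on uniq -c instead of an awk array (assuming, as above, that the source IP is the 3rd field of your lastb output):

sudo lastb | awk '$3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/ {print $3}' | sort | uniq -c | sort -n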

netstat -an | grep 80 | wc -l
a=$(xwininfo |gawk 'BEGIN {FS="[x+ \t]*"} /-geometry/ {print int(($3+1)/2)*2"x"int(($4+1)/2)*2"+"$5"+"$6}') ; echo ${a} ; ffmpeg -f x11grab -s ${a} -r 10 -i :0.0 -sameq -f mp4 -s wvga -y /tmp/out.mpg
2012-08-31 14:48:41
User: dwygo
Functions: echo gawk
0

Now we can capture only a specific window (we have to choose it by clicking on it).

ffmpeg complains about "Frame size must be a multiple of 2" so we calculate the upper even number with (g)awk trickery.

We remove the grep since we are already using (g)awk here... why lose time with grep? ;)
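
The round-up-to-even trick can be seen in isolation with a made-up odd geometry (a minimal sketch, not tied to xwininfo):

echo "643 481" | awk '{ print int(($1+1)/2)*2 "x" int(($2+1)/2)*2 }'
# prints 644x482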

tcpdump -ntr NAME_OF_CAPTURED_FILE.pcap 'tcp[13] = 0x02 and dst port 80' | awk '{print $4}' | tr . ' ' | awk '{print $1"."$2"."$3"."$4}' | sort | uniq -c | awk ' {print $2 "\t" $1 }'
sudo apt-get remove $(dpkg -l|awk '/^ii linux-image-/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic*//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic*\nlinux-image-%s-generic*\n",$0,$0,$0)}')
2012-08-15 10:02:12
User: mtron
Functions: awk sed sudo
3

Remove old kernels (*-generic and *-generic-pae) via apt-get on Debian/Ubuntu based systems. Tested on Ubuntu 10.04 - 12.04.
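
Before letting apt-get actually remove anything, it may be worth previewing what the pipeline selects; the same command with apt-get's -s (simulate) flag is a harmless dry run:

sudo apt-get -s remove $(dpkg -l|awk '/^ii linux-image-/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic*//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic*\nlinux-image-%s-generic*\n",$0,$0,$0)}')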

cat /dev/urandom|od -t x1|awk 'NR > line { pos=int(rand()*15)+2;printf("%s",$pos);line=NR+(rand()*1000);digits = digits+2 } digits == 64 { print("\n");exit }'
2012-08-14 19:02:00
User: jetdillo
Functions: awk cat exit od
1

Use this the next time you need to come up with a reasonably random bitstring, like for a WPA/WPA2 PSK or something. It takes a continuous stream of bytes coming from /dev/urandom, runs it through od(1), picks a random field ($0 and $1 excluded) from a random line, and then prints it.
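
If you only need a fixed-length hex string and not the random-sampling behaviour, a much shorter sketch (assuming a reasonably standard od) reads 32 random bytes and prints them as 64 hex digits:

od -An -tx1 -N32 /dev/urandom | tr -d ' \n'; echo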

egrep '.*(("STATUS)|("HEAD)).*' http_access.2012.07.18.log | awk '{sum+=$11; ++n} END {print "Tot="sum"("n")";print "Avg="sum/n}'
2012-07-27 12:18:29
User: fanchok
Functions: awk egrep
0

Depending on your Apache access log configuration you may have to change the sum+=$11 to the previous or next awk field.

Be aware that in the access log the last field is usually the response time in microseconds and the penultimate field is the response size in bytes. You can use this command line to calculate the sum and average of response sizes.

You can also refine the egrep regexp to match specific HTTP requests.
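
For example, with the common/combined log format, where the response size is the 10th whitespace-separated field, the adjustment would be (a sketch that skips the egrep filter; verify the field number against your own LogFormat):

awk '{sum+=$10; ++n} END {print "Tot="sum"("n")"; print "Avg="sum/n}' http_access.2012.07.18.log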

hostname -I
2012-07-18 19:43:48
User: bashfan
Functions: hostname
Tags: awk IP ip address
0

That's the easiest way to do it. -I (capital i) displays all network addresses of the host.
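
If you only want the first address, a tiny sketch:

hostname -I | awk '{print $1}'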

ip -f inet a | awk '/inet / { print $2 }'
2012-07-18 15:13:10
User: BorneBjoern
Functions: awk
Tags: awk IP ip address
3

Gives you each configured IP on a separate line.
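
The addresses keep their prefix length (e.g. 192.168.1.10/24); if you want the bare IPs, one possible sketch strips the suffix inside the same awk:

ip -f inet a | awk '/inet / { sub(/\/.*/, "", $2); print $2 }'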

netstat -tn | grep :80 | awk '{print $5}'| grep -v ':80' | cut -f1 -d: |cut -f1,2,3 -d. | sort | uniq -c| sort -n
2012-06-26 08:29:37
User: krishnan
Functions: awk cut grep netstat sort uniq
0

cut -f1,2 - IP range 16 (the /16 form is written out below)

cut -f1,2,3 - IP range 24

cut -f1,2,3,4 - IP range 32
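
Written out, the /16 variant of the same pipeline looks like this:

netstat -tn | grep :80 | awk '{print $5}'| grep -v ':80' | cut -f1 -d: | cut -f1,2 -d. | sort | uniq -c | sort -n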

svn status | grep "^M" | while read entry; do file=`echo $entry | awk '{print $2}'`; echo $file; svn revert $file; done
2012-06-17 16:01:06
User: wsams
Functions: awk echo grep read
0

This command reverts every modified file one by one in a while loop, but after "echo $file;" you can also add any sort of processing you might want before the revert happens.
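
For instance, to keep a throwaway copy of each change before reverting it, a hypothetical variant (the /tmp backup path is just an illustration) could be:

svn status | grep "^M" | while read entry; do file=`echo $entry | awk '{print $2}'`; cp "$file" "/tmp/$(basename "$file").bak"; svn revert "$file"; done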

awk -F'\t' '{print $0 >> ($5 ".tsv")}'
2012-05-16 18:18:16
User: pykler
Functions: awk
Tags: awk split tsv
0

Splits the standard input lines into files grouped by (and named after) the content of the 5th column.
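
A usage sketch, assuming a tab-separated file data.tsv (a hypothetical name) whose 5th column holds a category; each category ends up in its own <category>.tsv file:

awk -F'\t' '{print $0 >> ($5 ".tsv")}' data.tsv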

cal 04 2012 | awk '{ $7 && X=$7 } END { print X }'
2012-05-06 23:43:21
User: flatcap
Functions: awk cal
2

Prints the date of the last Saturday of the month. If your locale has Monday as the first day of the week, like mine in the UK, change the two $7s into $6.
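
With a Monday-first locale the adjusted version would be:

cal 04 2012 | awk '{ $6 && X=$6 } END { print X }'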

echo `disklabel mfid1s4 | sed -n '$p' | awk '{print $2}'` / 1024 / 1024 | bc -l
cal 04 2012 | awk 'NF <= 7 { print $7 }' | grep -v "^$" | tail -1
2012-05-03 16:57:45
User: javidjamae
Functions: awk cal grep tail
-2

This is a little trickier than finding the last Sunday, because you know the last Sunday is in the first position of the last line. The trick is to use NF less than or equal to 7 so it picks up all the lines, and then grep out any empty lines.
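
The easier case mentioned above (the last Sunday sitting in the first position of the last line) can be sketched like this, assuming the default Sunday-first layout:

cal 04 2012 | awk 'NF { last=$1 } END { print last }'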

lynx -dump http://www.domain.com | awk '/http/{print $2}' | egrep "^https{0,1}"
for k in $(git branch | sed /\*/d); do echo "$(git log -1 --pretty=format:"%ct" $k) $k"; done | sort -r | awk '{print $2}'
2012-04-07 11:19:00
User: dahuie
Functions: awk echo sed sort
Tags: bash git sed awk
0

Simpler and without all of the coloring gimmicks. This just returns a list of branches with the most recent first. This should be useful for cleaning your remotes.
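
A possible follow-up when cleaning up remotes: show only the five least recently touched branches by tacking a tail onto the end, e.g.

for k in $(git branch | sed /\*/d); do echo "$(git log -1 --pretty=format:"%ct" $k) $k"; done | sort -r | awk '{print $2}' | tail -5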

ps h --ppid $(cat /var/run/apache2.pid) | awk '{print"-p " $1}' | xargs sudo strace
2012-03-21 01:59:41
Functions: awk cat ps sudo xargs
3

Like the original version, except it does not include the parent apache process or the grep process, and adds "sudo" so it can be run by a regular user.
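
If pgrep is available, roughly the same child-process list can be built without ps (a sketch, assuming the same pid file location):

sudo strace $(pgrep -P $(cat /var/run/apache2.pid) | sed 's/^/-p /')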

curl -s mobile.twitter.com/search | sed -n '/trend_footer_list/,/\ul>/p' | awk -F\> '{print $3}' | awk -F\< '{print $1}' | sed '/^$/d'
2012-03-15 17:17:06
User: articmonkey
Functions: awk sed
Tags: twitter awk curl
0

Prints top 5 twitter topics. Not very well written at all but none of the others worked.

find /path/to/dir -iname "*.ext" -print0 | xargs -0 mplayer -really-quiet -cache 64 -vo dummy -ao dummy -identify 2>/dev/null | awk '/ID_LENGTH/{gsub(/ID_LENGTH=/,"")}{SUM += $1}END{ printf "%02d:%02d:%02d\n",SUM/3600,SUM%3600/60,SUM%60}'
2012-03-11 12:28:48
User: DarkSniper
Functions: awk find printf xargs
0

Improvement on Coderjoe's Solution. Gets rid of grep and cut (and implements them in awk) and specifies some different mplayer options that speed things up a bit.

awk 'FNR==100 {print;exit}' file
2012-03-04 20:25:57
User: Testuser_01
Functions: awk
Tags: awk time LINES
0

Prints line 100 of the file and exits immediately, which saves parsing time for operations on very big files.
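
The sed equivalent of the same trick (print line 100, then quit instead of reading the rest of the file) would be:

sed -n '100{p;q}' file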

awk '{cmd="date --date=\""$1"\" +\"%Y/%m/%d %H:%M:%S\" "; cmd | getline convdate; print cmd";"convdate }' file.txt
2012-02-28 14:08:52
User: EBAH
Functions: awk
1

Convert readable date/time with `date` command
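
A quick usage sketch with a hypothetical file.txt whose first field is a human-readable date (printing just the converted value; for long files you would also want to close(cmd) after each line):

echo "2012-02-28 some event" > file.txt
awk '{cmd="date --date=\""$1"\" +\"%Y/%m/%d %H:%M:%S\" "; cmd | getline convdate; print convdate }' file.txt
# prints 2012/02/28 00:00:00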

print "$(lsvg -Lo |xargs lsvg -L|grep "TOTAL PPs"|awk -F"(" '{print$2}'|sed -e "s/)//g" -e "s/megabytes/+/g"|xargs|sed -e "s/^/(/g" -e "s/+$/)\/1000/g"|bc ) GB"
2012-02-03 13:58:41
0

Not figured out by me, but by a colleague of mine.

See the total amount of data on an AIX machine.

sed -r 's/(\[|])//g' | awk ' { $1=strftime("%D %T",$1); print }'
2012-02-03 13:07:37
User: Zulu
Functions: awk sed
Tags: sed awk timestamp
0

It removes the square brackets and converts the UNIX timestamp to human-readable time on every line of a stream (or file).
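
A quick sketch of what this does to a log-style line carrying an epoch timestamp in square brackets (exact output depends on your timezone and locale; strftime here assumes gawk):

echo "[1328274457] service restarted" | sed -r 's/(\[|])//g' | awk '{ $1=strftime("%D %T",$1); print }'
# something like: 02/03/12 13:07:37 service restarted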