Commands tagged awk (305)

  • awk can clear the screen while displaying output. This is a handy way of seeing how many lines a tail -f has emitted or how many files find has found. On Solaris you may have to use 'nawk', and your machine needs 'tput'.


    0
    cat /dev/urandom|awk 'BEGIN{"tput cuu1" | getline CursorUp; "tput clear" | getline Clear; printf Clear}{num+=1;printf CursorUp; print num}'
    axelabs · 2009-07-13 07:30:51 0
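    A minimal sketch of the same trick applied to find, as the description suggests (the search path is only illustrative):
    find /usr -type f 2>/dev/null | awk 'BEGIN{"tput cuu1"|getline up; "tput clear"|getline cls; printf cls}{n++; printf up; print n}'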
  • Basically it splits any normal text into one word per line, creating a simple word list file.


    1
    awk '{c=split($0, s); for(n=1; n<=c; ++n) print s[n] }' INPUT_FILE > OUTPUT_FILE
    agony · 2009-07-06 06:10:21 3
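    A quick way to see what it does, with printf standing in for a real input file:
    printf 'the quick brown fox\njumps over\n' | awk '{c=split($0, s); for(n=1; n<=c; ++n) print s[n] }'
    Piping the result through sort -u turns it into a deduplicated, sorted word list.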
  • Helpful when we want to do mass file renaming (especially MP3s): it lowercases the string and then capitalizes its first letter.


    0
    echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
    mohan43u · 2009-06-23 21:11:34 5
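    For example, assuming STRING holds an arbitrary title:
    STRING="my favourite SONG"; echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
    prints "My favourite song": tr lowercases everything, then awk re-capitalizes the first character.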
  • You need sysstat and gawk for this to work.


    0
    for x in `seq -w 1 30`; do sar -b -f /var/log/sa/sa$x | gawk '/Average/ {print $2}'; done
    unixmonkey4319 · 2009-06-17 02:46:52 0
  • Just change the date following the -r flag and/or the user name in the user== conditional, and substitute yms_web with the name of your module (a parameterized sketch follows below).


    4
    svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
    jemptymethod · 2009-06-05 14:07:28 3
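    The same pipeline with the date, user and module pulled out into shell variables (the variable names are illustrative):
    SINCE='2009-05-21'; WHO='george'; MODULE='yms_web'; svn log -v -r"{$SINCE}":HEAD | awk -v u="$WHO" -v m="$MODULE" '/^r[0-9]+ / {user=$3} $0 ~ m {if (user==u) print $2}' | sort | uniq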

  • 5
    awk '/match/{print NR}' file
    inof · 2009-06-03 17:34:13 3
  • Ever compress a file for the web by replacing all newline characters with nothing, so it becomes one nice big blob? Great idea, until you need to edit that file again: serious pain in the butt. I ran into this today when my only copy of a CSS file had been "compressed" with no newlines. This one-liner re-inserts a newline after every {, } and ;, converting it back into nice human-readable CSS. It could be nicer, but it does the job.


    1
    cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
    lrvick · 2009-06-02 15:51:51 5
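    Roughly the same effect with GNU sed, for comparison, and without the extra cat (BSD sed needs a literal newline in place of \n):
    sed 's/[{};]/&\n/g' somefile.css >> uncompressed.css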
  • This prepends a random number as the first field of every line in SOMEFILE, then sorts by that first column and finally cuts off the random numbers, leaving the lines in shuffled order.


    4
    awk 'BEGIN{srand()}{print rand(),$0}' SOMEFILE | sort -n | cut -d ' ' -f2-
    axelabs · 2009-05-29 01:20:50 4
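    Where GNU coreutils is available, shuf does the same job in one step; the awk version remains handy where it is not:
    shuf SOMEFILE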
  • Sometimes jittery data hides trends; a rolling average can give a clearer view. The window size is set with size=5 in the BEGIN block.


    3
    awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}' file.dat
    mungewell · 2009-05-29 00:07:24 0
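    A quick way to see the smoothing, using a noisy ramp generated on the fly instead of file.dat:
    seq 100 | awk 'BEGIN{srand()} {print $1 + 10*rand()}' | awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}'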
  • Exports every revision of fileName into /tmp; exported files get an .rN extension (e.g. .r23, where 23 is the revision number).


    2
    svn log fileName|cut -d" " -f 1|grep -e "^r[0-9]\{1,\}$"|awk {'sub(/^r/,"",$1);print "svn cat fileName@"$1" > /tmp/fileName.r"$1'}|sh
    fizz · 2009-05-27 02:11:58 4
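    The same idea is arguably easier to read as a shell loop (fileName is a placeholder, as above):
    for r in $(svn log fileName | awk '/^r[0-9]+ /{sub(/^r/,"",$1); print $1}'); do svn cat fileName@$r > /tmp/fileName.r$r; done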
  • This command might not be useful for most of us; I just wanted to share it to show the power of the command line. It downloads the plain-text version of the novel David Copperfield from Project Gutenberg, generates a single column of words, and then counts the occurrences of each word with the sort | uniq -c combination, removing numbers and single characters from the count. I'm sure you can write a shorter version (one attempt follows below).


    -4
    wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
    alperyilmaz · 2009-05-04 16:00:39 8
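    A shorter variant of the counting stage, splitting on anything that is not a letter with tr -cs (output is sorted by frequency rather than alphabetically):
    wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr -cs 'A-Za-z' '\n' | tr 'A-Z' 'a-z' | awk 'length > 1' | sort | uniq -c | sort -rn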
  • The command below expects the following awk script, saved as ph-vmstat.awk, which rewrites selected vmstat columns as human-readable figures:

    % cat ph-vmstat.awk
    # Return human readable numbers
    function hrnum(a) {
        b = a ;
        if (a > 1000000) { b = sprintf("%2.2fM", a/1000000) ; }
        else if (a > 1000) { b = sprintf("%2.2fK", a/1000) ; }
        return(b) ;
    }
    # Return human readable storage
    function hrstorage(a) {
        b = a ;
        if (a > 1024000) { b = sprintf("%2.2fG", a/1024/1024) ; }
        else if (a > 1024) { b = sprintf("%2.2fM", a/1024) ; }
        return(b) ;
    }
    OFS=" " ;
    $1 !~ /[0-9].*/ {print}
    $1 ~ /[0-9].*/ {
        $4 = hrstorage($4) ;
        $5 = hrstorage($5) ;
        $9 = hrnum($9) ;
        $10 = hrnum($10) ;
        $17 = hrnum($17) ;
        $18 = hrnum($18) ;
        $19 = hrnum($19) ;
        print ;
    }


    5
    vmstat 1 10 | /usr/xpg4/bin/awk -f ph-vmstat.awk
    MarcoN · 2009-05-04 04:55:00 0
  • OK, so it's a really useless line and I'm sorry for that; furthermore it's not optimized at all. At first I couldn't get netstat -p to print out which process was handling the open port 4444; in the end I realized I was not root, so security restrictions applied ;p. It's nevertheless a (good?) way to see how ps(tree) works, as it acts exactly the same way by reading /proc. For a specific port, this line returns the calling command line of every process that holds the associated socket (privileged equivalents with lsof/ss follow below).


    -5
    p=$(netstat -nate 2>/dev/null | awk '/LISTEN/ {gsub (/.*:/, "", $4); if ($4 == "4444") {print $8}}'); for i in $(ls /proc/|grep "^[1-9]"); do [[ $(ls -l /proc/$i/fd/|grep socket|sed -e 's|.*\[\(.*\)\]|\1|'|grep $p) ]] && cat /proc/$i/cmdline && echo; done
    j0rn · 2009-04-30 12:39:48 1
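    Run with sufficient privileges, lsof or ss answer the same question directly; the /proc walk above is mostly instructive:
    lsof -i :4444
    ss -ltnp '( sport = :4444 )'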
  • For packages that have already been removed but left their configuration behind, this purges that configuration from the system. To test it out first, drop the trailing -y and it will show you what it would purge without actually doing it. It never hurts to check first, "just in case." ;)


    -3
    dpkg-query -l| grep -v "ii " | grep "rc " | awk '{print $2" "}' | tr -d "\n" | xargs aptitude purge -y
    thepicard · 2009-04-28 19:25:53 3
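    A slightly more direct sketch of the same cleanup, matching the "rc" status at the start of the line (drop the -y to be prompted first; -r stops xargs running on empty input):
    dpkg -l | awk '/^rc/ {print $2}' | xargs -r aptitude purge -y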
  • Useful for removing a package and its dependencies, for example the GNOME desktop environment. Configuration files will be removed as well, so be careful and sure that you really want to do this.


    -4
    sudo apt-get remove --purge `dpkg -l | awk '{print $2}' | grep gnome` && apt-get autoremove
    kelevra · 2009-04-28 10:34:42 7
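    To preview what would be removed before committing, apt-get can simulate the run (same package selection as above):
    sudo apt-get -s remove --purge `dpkg -l | awk '{print $2}' | grep gnome`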
  • This is useful if you need to do port forwarding and your router doesn't assign static IPs. You can add it to a script run from a cron job that checks whether your IP has recently changed, or call it from a trigger script. Tested on Mac OS X.


    9
    ifconfig en1 | awk '/inet / {print $2}' | mail -s "hello world" email@email.com
    rez0r · 2009-04-28 06:01:52 4
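    A sketch of the "only mail when it changes" variant the description mentions (~/.last_ip is just a scratch file for this example):
    ip=$(ifconfig en1 | awk '/inet / {print $2}'); [ "$ip" != "$(cat ~/.last_ip 2>/dev/null)" ] && { echo "$ip" > ~/.last_ip; echo "$ip" | mail -s "IP changed" email@email.com; }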
  • Lists the local files that are not present in the remote repository (lines beginning with ?) and adds them.


    1
    svn status | grep '^?' | awk '{ print $2; }' | xargs svn add
    unixmonkey4200 · 2009-04-10 21:55:37 2
  • This command splits the contents of FILENAME into individual .txt files, using the 3rd column of each line as the file name. If FILENAME contains:

        foo foo A foo
        bar bar B bar
        lorem ipsum A lorem

    then two files, A.txt and B.txt, are created. A.txt will contain:

        foo foo A foo
        lorem ipsum A lorem

    and B.txt will contain:

        bar bar B bar


    2
    awk '{print > $3".txt"}' FILENAME
    alperyilmaz · 2009-03-31 15:14:13 0
  • Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server, grouped by source address and sorted by count. BSD version:

    netstat -na | awk '{print $5}' | cut -d "." -f1,2,3,4 | sort | uniq -c | sort -nr


    14
    netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    tiagofischer · 2009-03-28 21:02:26 4
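    On Linux systems where netstat has given way to ss, roughly the same tally (NR>1 skips the header line; the cut still assumes IPv4 addresses):
    ss -ntu | awk 'NR>1 {print $6}' | cut -d: -f1 | sort | uniq -c | sort -n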

  • 6
    awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)**2)}' file.dat
    kaan · 2009-03-24 21:56:40 4
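    This prints the population standard deviation of column one. A quick sanity check with seq as stand-in data:
    seq 10 | awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)**2)}'
    prints roughly 2.87228, the population standard deviation of the numbers 1 through 10 (note that ** is a gawk extension; POSIX awk uses ^).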
  • The arguments of "seq" indicate the starting value, step size, and the end value of the x-range. "awk" outputs (x, f(x)) pairs and pipes them to "graph", which is part of the "plotutils" package.


    2
    seq 0 0.1 20 | awk '{print $1, cos(0.5*$1)*sin(5*$1)}' | graph -T X
    kaan · 2009-03-24 21:46:59 3
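    graph can also render to a file instead of an X window, e.g. PNG, if your plotutils build supports it (the output name is illustrative):
    seq 0 0.1 20 | awk '{print $1, cos(0.5*$1)*sin(5*$1)}' | graph -T png > plot.png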
  • Displays six rows and five columns of random numbers between 0 and 1. If you need only one column, you can dispense with the "for" loop.


    3
    seq 6 | awk '{for(x=1; x<=5; x++) {printf ("%f ", rand())}; printf ("\n")}'
    kaan · 2009-03-24 21:33:38 0
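    Note that awk's rand() yields the same sequence on every run unless it is seeded; a seeded variant:
    seq 6 | awk 'BEGIN{srand()} {for(x=1; x<=5; x++) {printf ("%f ", rand())}; printf ("\n")}'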
  • This example calculates the averages of column one and column two of "file.dat". It can be easily modified if other columns are to be averaged.


    2
    awk '{sum1+=$1; sum2+=$2} END {print sum1/NR, sum2/NR}' file.dat
    kaan · 2009-03-24 21:22:14 1
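    To average an arbitrary column, the column number can be passed in with -v (column 3 here, purely as an example):
    awk -v col=3 '{sum+=$col} END {print sum/NR}' file.dat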
  • Another combination of seq and awk: prints the first 50 Fibonacci numbers. Not very efficient, but sufficiently quick.


    14
    seq 50| awk 'BEGIN {a=1; b=1} {print a; c=a+b; a=b; b=c}'
    kaan · 2009-03-24 20:39:24 1
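    The seq is only there to drive the iterations; the same thing can be done entirely inside awk:
    awk 'BEGIN {a=1; b=1; for(i=1; i<=50; i++) {print a; c=a+b; a=b; b=c}}'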
  • "seq 100" outputs 1,2,..,100, separated by newlines. awk adds them up and displays the sum. "seq 1 2 11" outputs 1,3,..,11. Variations: 1+3+...+(2n-1) = n^2 seq 1 2 19 | awk '{sum+=$1} END {print sum}' # displays 100 1/2 + 1/4 + ... = 1 seq 10 | awk '{sum+=1/(2**$1)} END {print sum}' # displays 0.999023 Show Sample Output


    4
    seq 100 | awk '{sum+=$1} END {print sum}'
    kaan · 2009-03-24 20:30:40 0