Commands using awk (1,402)

  • This command takes the output of the 'last' command, removes empty lines, extracts just the first field (the username), sorts the usernames in reverse order, and then gives a summary count of unique matches. A count-sorted variant follows this entry.


    15
    last | grep -v "^$" | awk '{ print $1 }' | sort -nr | uniq -c
    hkyeakley · 2009-02-18 16:38:59 1
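    A hedged variant of the command above: if you would rather rank users by how often they log in, skip the reverse sort on usernames and sort the final counts numerically instead.
    last | grep -v "^$" | awk '{ print $1 }' | sort | uniq -c | sort -rn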
  • This command lets you see and scroll through all of the strings stored in RAM at any given time. Press the space bar to see more pages (or use the arrow keys, etc.). Sometimes, if you didn't save the file you were working on, or want to get back something you closed, it can be found floating around in here! The awk command only shows lines longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable"). If you want to dump the whole thing to a file, replace the final '| less' with '> memorydump'. This is great for searching through many times (with the added bonus that it doesn't overwrite any memory...). Here's a neat example that turns up conversations that were had in Pidgin (it will probably work even after Pidgin has been closed): sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})' (depending on sudo settings, it might be best to run sudo su first to get a # prompt). A sketch of this dump-and-search workflow follows this entry.


    15
    sudo cat /proc/kcore | strings | awk 'length > 20' | less
    nesquick · 2009-03-09 02:19:47 5
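    A sketch of the dump-and-search workflow described above (memorydump is just an example file name; the grep pattern is the Pidgin-style "(HH:MM:SS)" timestamp from the description):
    sudo cat /proc/kcore | strings | awk 'length > 20' > memorydump
    grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})' memorydump | less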
  • This is a very simple and lightweight way to play DI.FM stations. The command line parses the HTML returned from http://di.fm and displays all radio stations in a nice graphical menu; after a radio is chosen, its URL is passed to mplayer so the music can start. For a more complete version of the command with proper strings in the menu (it couldn't fit in the command field above), see the reflowed version after this entry. Dependencies: X11 with a GTK environment; zenity, a simple app for displaying GTK menus (sudo apt-get install zenity on Ubuntu); mplayer, a simple audio player (sudo apt-get install mplayer on Ubuntu).


    15
    zenity --list --width 500 --height 500 --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer
    polaco · 2010-04-28 23:45:35 7
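    The more complete version mentioned above, with --title and --text supplying proper strings for the menu, reflowed here for readability:
    zenity --list --width 500 --height 500 --title 'DI.FM' --text 'Pick a Radio' \
        --column 'radio' --column 'url' --print-column 2 \
        $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort \
        | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer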
  • Just a simple way to watch interface traffic without the need for additional tools. Of course, replace eth0 with your interface. A reflowed version of the loop follows this entry.


    15
    while [ /bin/true ]; do OLD=$NEW; NEW=`cat /proc/net/dev | grep eth0 | tr -s ' ' | cut -d' ' -f "3 11"`; echo $NEW $OLD | awk '{printf("\rin: % 9.2g\t\tout: % 9.2g", ($1-$3)/1024, ($2-$4)/1024)}'; sleep 1; done
    hons · 2011-03-22 10:02:23 3
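    The same loop, reflowed as a sketch with the interface name in a variable (eth0 is only a placeholder; the field numbers assume a short, space-padded interface name, as in the original):
    IF=eth0                                  # substitute your interface
    while true; do
        OLD=$NEW
        NEW=$(grep "$IF" /proc/net/dev | tr -s ' ' | cut -d' ' -f "3 11")   # RX bytes, TX bytes
        echo $NEW $OLD | awk '{printf("\rin: % 9.2g\t\tout: % 9.2g", ($1-$3)/1024, ($2-$4)/1024)}'
        sleep 1
    done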
  • parse "lsmod" output to "dot" format and pass it to "display". Without perl!


    15
    lsmod | awk 'BEGIN{print "digraph{"}{split($4, a, ","); for (i in a) print $1, "->", a[i]}END{print "}"}'|display
    point_to_null · 2011-12-04 01:41:23 2
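    If you would rather keep the graph than view it immediately, the same dot text can be rendered to a file with Graphviz; a sketch, assuming dot is installed:
    lsmod | awk 'BEGIN{print "digraph{"}{split($4, a, ","); for (i in a) print $1, "->", a[i]}END{print "}"}' | dot -Tpng -o modules.png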
  • Another combination of seq and awk, this time printing the first 50 Fibonacci numbers. Not very efficient, but sufficiently quick. A seq-free variant follows this entry.


    14
    seq 50| awk 'BEGIN {a=1; b=1} {print a; c=a+b; a=b; b=c}'
    kaan · 2009-03-24 20:39:24 2
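    For comparison, a seq-free sketch that does the same thing entirely inside awk (the 50 is just the number of terms):
    awk 'BEGIN{a=1; b=1; for (i=0; i<50; i++) {print a; c=a+b; a=b; b=c}}'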
  • Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server and sorts them by count. The BSD version is shown on its own line after this entry.


    14
    netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    tiagofischer · 2009-03-28 21:02:26 5
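    The BSD version referred to above, on its own line for easier copy-paste:
    netstat -na | awk '{print $5}' | cut -d "." -f1,2,3,4 | sort | uniq -c | sort -nr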
  • Plot your most used commands with gnuplot.


    14
    history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head > /tmp/cmds | gnuplot -persist <(echo 'plot "/tmp/cmds" using 1:xticlabels(2) with boxes')
    sthrs · 2010-06-13 23:35:13 2

  • 14
    curl -s https://api.github.com/users/<username>/repos?per_page=1000 |grep git_url |awk '{print $2}'| sed 's/"\(.*\)",/\1/'
    wuziduzi · 2019-11-19 20:31:19 6
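    This lists the git_url of every public repository for a GitHub user (note that the API caps per_page at 100, so larger accounts need pagination). A hedged usage sketch that clones each listed repository; GitHub may no longer serve the unauthenticated git:// protocol, so you may need to extract clone_url (https) instead:
    curl -s https://api.github.com/users/<username>/repos?per_page=1000 | grep git_url | awk '{print $2}' | sed 's/"\(.*\)",/\1/' | xargs -r -L1 git clone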
  • Purge all configuration files of removed packages. A dpkg-based alternative follows this entry.


    13
    sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`
    kelevra · 2009-04-28 11:44:04 5
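    A hedged alternative for systems without aptitude, asking dpkg directly for packages in the "rc" state (removed, config files remaining); a sketch:
    sudo apt-get purge $(dpkg -l | awk '/^rc/ {print $2}')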

  • 13
    lynx -dump http://www.domain.com | awk '/http/{print $2}'
    putnamhill · 2010-09-04 12:48:19 3
  • I'm using gawk; you may get varying mileage with other varieties. You might want to change the / after du to, say, /home/ or /var, otherwise this command might take quite some time to complete. Sorry it's so obfuscated; I had to turn a script into a one-liner under 255 characters for commandlinefu. Note: the bar ratio is relative, so the highest ratio of the total disk "anchors" the rest of the graph. EDIT: the math was slightly wrong, fixed it. Also, made it compliant with older versions of df.


    13
    t=$(df|awk 'NR!=1{sum+=$2}END{print sum}');sudo du / --max-depth=1|sed '$d'|sort -rn -k1 | awk -v t=$t 'OFMT="%d" {M=64; for (a=0;a<$1;a++){if (a>c){c=a}}br=a/c;b=M*br;for(x=0;x<b;x++){printf "\033[1;31m" "|" "\033[0m"}print " "$2" "(a/t*100)"% total"}'
    kevinquinnyo · 2011-12-01 01:21:11 9
  • Self-referential use of wget.


    12
    wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]"\n"} '
    root · 2009-01-30 19:16:50 7
  • Rather than chaining a string of greps together and piping them to awk, use awk to do all the work. In the example below, a line is printed to stdout if it matches pattern1 AND pattern2, but NOT pattern3. The equivalent grep chain is shown after this entry for comparison.


    12
    awk '/pattern1/ && /pattern2/ && !/pattern3/ {print}'
    themensch · 2009-02-05 15:18:19 3
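    For comparison, the grep chain that the single awk program above replaces:
    grep 'pattern1' | grep 'pattern2' | grep -v 'pattern3'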
  • Useful in case of abuse/DoS attacks.


    12
    netstat -anp |grep 'tcp\|udp' | awk '{print $5}' | sed s/::ffff:// | cut -d: -f1 | sort | uniq -c | sort -n
    dt · 2009-02-15 09:16:16 2
  • A variation of a script I found on this site, slimmed down to just use awk. It displays all users who have attempted to log in to the box over SSH and failed. Pipe it through sort to see which usernames have the most failed logins; a sketch of that pipeline follows this entry.


    12
    awk '/sshd/ && /Failed/ {gsub(/invalid user/,""); printf "%-12s %-16s %s-%s-%s\n", $9, $11, $1, $2, $3}' /var/log/auth.log
    frailotis · 2009-04-16 00:56:23 0
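    A sketch of the sort pipeline mentioned above: the first output column is the username, so count it and rank the counts.
    awk '/sshd/ && /Failed/ {gsub(/invalid user/,""); printf "%-12s %-16s %s-%s-%s\n", $9, $11, $1, $2, $3}' /var/log/auth.log | awk '{print $1}' | sort | uniq -c | sort -rn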
  • A simple calculator function; floating point numbers are supported. A usage sketch follows this entry.


    12
    calc(){ awk "BEGIN{ print $* }" ;}
    twfcc · 2009-10-23 06:03:07 2
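    A usage sketch: quote the expression so the shell doesn't expand characters like * or (); any awk expression works, including built-ins such as sqrt and atan2.
    calc "2/3 * 7.5"        # 5
    calc "sqrt(2)"          # 1.41421
    calc "atan2(1,1) * 4"   # 3.14159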
  • Proper screencast with audio using ffmpeg and x264, as per http://verb3k.wordpress.com/2010/01/26/how-to-do-proper-screencasts-on-linux/


    12
    ffmpeg -y -f alsa -ac 2 -i pulse -f x11grab -r 30 -s `xdpyinfo | grep 'dimensions:'|awk '{print $2}'` -i :0.0 -acodec pcm_s16le output.wav -an -vcodec libx264 -vpre lossless_ultrafast -threads 0 output.mp4
    NoahY · 2010-11-19 09:31:56 2

  • 12
    apt-get install `ssh root@host_you_want_to_clone "dpkg -l | grep ii" | awk '{print $2}'`
    TuxOtaku · 2011-05-10 13:33:51 3
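    A hedged alternative that round-trips dpkg's own selection state instead of grepping "ii" lines (host_you_want_to_clone is the same placeholder as above); a sketch:
    ssh root@host_you_want_to_clone "dpkg --get-selections" > selections.txt
    sudo dpkg --set-selections < selections.txt
    sudo apt-get dselect-upgrade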
  • Awk replaces every instance of foo with bar in the 5th column only; a worked example follows this entry.


    12
    awk '{gsub("foo","bar",$5)}1' file
    zlemini · 2011-11-09 18:24:23 0
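    A worked example on hypothetical single-space-separated input; note that awk rebuilds the line with OFS once $5 is modified:
    printf 'foo a b c foo x\n' | awk '{gsub("foo","bar",$5)}1'
    # prints: foo a b c bar x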
  • ...not guaranteed to always be accurate, but fun to see how old your Linux installation is, based on the root partition's file system creation date.


    12
    sudo tune2fs -l $(df -h / |(read; awk '{print $1; exit}')) | grep -i created
    thechile · 2013-08-08 15:18:09 5
  • Using cat WAR_AND_PEACE_By_LeoTolstoi.txt | tr -cs "[:alnum:]" "\n" | tr "[:lower:]" "[:upper:]" | sort -S16M | uniq -c | sort -nr | cat -n | head -n 30 ("sort -S1G" is Linux/GNU sort only) will also do the job, but it has some drawbacks for bigger files (caused by the space/time complexity of sorting)...


    11
    cat WAR_AND_PEACE_By_LeoTolstoi.txt | tr -cs "[:alnum:]" "\n"| tr "[:lower:]" "[:upper:]" | awk '{h[$1]++}END{for (i in h){print h[i]" "i}}'|sort -nr | cat -n | head -n 30
    cp · 2010-07-05 06:39:20 5
  • Here's an awk alternative for those lacking a version of cut with the --complement argument; a comparison follows this entry.


    11
    awk '{ $5=""; print }' file
    zlemini · 2010-10-22 09:48:49 0
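    For comparison, the GNU cut form the description alludes to; cut needs a fixed single-character delimiter and drops the field entirely, whereas the awk version blanks $5 (leaving a doubled space) and resplits on any run of whitespace:
    cut --complement -d' ' -f5 file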
  • It identifies the parents of the zombie processes and kills them, so the new parent of the orphaned zombies will be the init process, which is already waiting to reap them. Be careful! It may also kill useful processes just because they are not waiting on their children (bad parents!). A cautious dry run is sketched after this entry.


    11
    kill -9 `ps -xaw -o state -o ppid | grep Z | grep -v PID | awk '{print $2}'`
    khashmeshab · 2010-10-27 07:29:14 7
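    Given the warning above, a cautious dry run first: list the zombies with their PIDs, parent PIDs and names before killing anything (a sketch built on the same ps flags):
    ps -xaw -o state -o pid -o ppid -o comm | awk '$1 ~ /^Z/'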
  • ...plus do a sort according to frequency.


    11
    find . -type f | awk -F'.' '{print $NF}' | sort| uniq -c | sort -g
    cp · 2011-02-14 09:15:29 0