Commands using awk (1,418)

  • Displays a histogram of established TCP connections, grouped by remote address. Works even better under an alias. Thanks @Areis1 for sharing this one.


    7
    netstat -an | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | awk '{ printf("%s\t%s\t",$2,$1); for (i = 0; i < $1; i++) {printf("*")}; print ""}'
    mramos · 2010-07-09 00:25:45 42
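    As the description hints, this works nicely as an alias; a minimal sketch (the alias name tcphist is just an example):

    alias tcphist='netstat -an | grep ESTABLISHED | awk '\''{print $5}'\'' | awk -F: '\''{print $1}'\'' | sort | uniq -c | awk '\''{ printf("%s\t%s\t",$2,$1); for (i = 0; i < $1; i++) {printf("*")}; print ""}'\'''
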
  • Removes old kernels. Note the double space in the pattern: "...^ii␣␣linux-image-2...". Like 5813, but fixes two bugs: [1] the meta-packages 'linux-headers-generic' and 'linux-image-generic' are left alone so that automatic upgrades keep working in the future; [2] kernels newer than the currently running one are left alone (this can happen if you didn't reboot after installing a new kernel). I'm bummed that this took 228 characters; I'd like to see a simpler version.


    7
    aptitude remove $(dpkg -l|awk '/^ii  linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic\nlinux-image-%s-generic\n",$0,$0,$0)}')
    __ · 2010-12-11 11:38:15 7
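    To review the removal list before committing, aptitude's -s (simulate) flag does a dry run; a cautious sketch wrapping the same substitution:

    aptitude -s remove $(dpkg -l|awk '/^ii  linux-image-2/{print $2}'|sed 's/linux-image-//'|awk -v v=`uname -r` 'v>$0'|sed 's/-generic//'|awk '{printf("linux-headers-%s\nlinux-headers-%s-generic\nlinux-image-%s-generic\n",$0,$0,$0)}')
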
  • This one-liner uses strace to attach to every currently running Apache process: the PIDs come from the initial "ps auxw" output and are turned into -p arguments by awk.


    7
    ps auxw | grep sbin/apache | awk '{print"-p " $2}' | xargs strace
    px · 2011-03-14 21:45:22 34
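    Note that "grep sbin/apache" can match its own grep process, so strace may complain about a PID that has already exited; a sketch of the same idea using pgrep, which avoids that:

    pgrep -f sbin/apache | awk '{print "-p", $1}' | xargs strace
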
  • limite = threshold (the upper bound of the chart scale). The command reads the first load-average figure from /proc/loadavg and echoes a Google Chart API URL that renders it as a gauge, titled with the hostname.


    7
    limite="5";load5=$(awk '{print $1}' /proc/loadavg);echo "http://chart.apis.google.com/chart?chxr=0,0,5&chxt=y&chs=700x240&cht=gm&chds=0,"$limite"&chd=t:"$load5"&chl="$load5"&chtt=$(hostname)+load+average"
    ncaio · 2011-04-02 17:55:24 8
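    To save the rendered gauge instead of just printing the URL, you could pipe the echo into wget (a sketch; the output filename is arbitrary, and the old Google Chart API may no longer serve these URLs):

    limite="5";load5=$(awk '{print $1}' /proc/loadavg);echo "http://chart.apis.google.com/chart?chxr=0,0,5&chxt=y&chs=700x240&cht=gm&chds=0,"$limite"&chd=t:"$load5"&chl="$load5"&chtt=$(hostname)+load+average" | wget -q -i - -O load.png
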
  • Imagine you've started a long-running process that involves piping data, but you forgot to add the progress-bar option to a command, e.g. xz -dc bigdata.xz | complicated-processing-program > summary. This command uses lsof to see how much data xz has read from the file: lsof -o0 -o -Fo FILENAME displays offsets (-o), in decimal (-o0), in parseable form (-Fo), producing output like "p12607 f3 o0t45187072" (process id (p), file descriptor (f), offset (o)). We stat the file to get its size (stat -c %s FILENAME), then plug the values into awk: split the line at the letter t (-Ft), define a variable for the file's size (-v s=$(stat...)), and only work on the offset line (/^o/). Note this command was tested using the Linux version of lsof; because it uses lsof's batch option (-F) it may be portable. Thanks to @unhammer for the brilliant idea.


    7
    F=bigdata.xz; lsof -o0 -o -Fo $F | awk -Ft -v s=$(stat -c %s $F) '/^o/{printf("%d%%\n", 100*$2/s)}'
    flatcap · 2015-09-19 22:22:43 18
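    To turn this into a crude progress meter, you could simply re-run it in a loop (a rough sketch, same file name assumed; stop it with Ctrl+C):

    F=bigdata.xz; while true; do lsof -o0 -o -Fo $F | awk -Ft -v s=$(stat -c %s $F) '/^o/{printf("%d%%\n", 100*$2/s)}'; sleep 5; done
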

  • 6
    echo "SHOW PROCESSLIST\G" | mysql -u root -p | grep "Info:" | awk -F":" '{count[$NF]++}END{for(i in count){printf("%d: %s\n", count[i], i)}}' | sort -n
    bpfx · 2009-02-05 14:24:20 40
  • Sometimes Apache gets stuck in an established state where you can't get a list of the connecting IPs from mod_status... not a good thing when you need to ban an abusive IP.


    6
    for i in `ps aux | grep httpd | awk '{print $2}'`; do lsof -n -p $i | grep ESTABLISHED; done;
    Shadow · 2009-02-05 17:50:52 67
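    To get a count per remote address (handy for spotting the abusive IP), you could extract the peer side of each connection; a sketch assuming IPv4 peers and that pgrep is available:

    for i in $(pgrep httpd); do lsof -n -p $i; done | awk '/ESTABLISHED/{print $(NF-1)}' | cut -d'>' -f2 | cut -d: -f1 | sort | uniq -c | sort -rn
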
  • ls -l with each file's permissions also printed in octal form.


    6
    ls -l | awk '{k=0;for(i=0;i<=8;i++)k+=((substr($1,i+2,1)~/[rwx]/)*2^(8-i));if(k)printf("%0o ",k);print}'
    iArno · 2009-02-13 19:20:54 20
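    With GNU coreutils, stat can print the octal mode directly; a simpler sketch of the same idea:

    stat -c '%a %n' *
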

  • 6
    awk '{sum+=$1; sumsq+=$1*$1} END {print sqrt(sumsq/NR - (sum/NR)**2)}' file.dat
    kaan · 2009-03-24 21:56:40 9

  • 6
    du -sb *|sort -nr|head|awk '{print $2}'|xargs du -sh
    srand · 2009-05-12 19:58:45 9
  • This is meant for the bash shell. Put this function in your .profile and you'll be able to use tab-completion when sshing to any host in your known_hosts file (assumed to be at ~/.ssh/known_hosts). The "complete" command should go on its own line; the one-liner below separates it from the function with a semicolon.


    6
    function autoCompleteHostname() { local hosts; local cur; hosts=($(awk '{print $1}' ~/.ssh/known_hosts | cut -d, -f1)); cur=${COMP_WORDS[COMP_CWORD]}; COMPREPLY=($(compgen -W '${hosts[@]}' -- $cur)); }; complete -F autoCompleteHostname ssh
    sbisordi · 2009-05-17 23:12:34 10
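    A related sketch that completes host aliases from ~/.ssh/config instead, useful when known_hosts is hashed and its first column is unreadable (it only takes the first alias on each Host line):

    complete -W "$(awk '/^Host /{print $2}' ~/.ssh/config)" ssh
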
  • Check which files are opened by Firefox, then sort by largest size (in MB). You can see all open files by replacing the grep pattern with "/". Useful if you'd like to debug which extensions or files are taking up too much space in Firefox.


    6
    FFPID=$(pidof firefox-bin) && lsof -p $FFPID | awk '{ if($7>0) print ($7/1024/1024)" MB -- "$9; }' | grep ".mozilla" | sort -rn
    josue · 2009-08-16 08:58:22 7
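    On current multiprocess builds the binary is usually named just "firefox" and reports several PIDs; a sketch that handles that (the process name is an assumption about your build):

    lsof -p "$(pidof firefox | tr ' ' ',')" | awk '$7>0 && $9 ~ /\.mozilla/ {print $7/1024/1024" MB -- "$9}' | sort -rn
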
  • This is an on-line algorithm for calculating the mean of the numbers in a column, also known as a "running" or "cumulative" average.


    6
    awk '{avg += ($1 - avg) / NR;} END { print avg; }'
    ashawley · 2009-09-10 17:06:03 7
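    The same on-line approach extends to the spread of the data (Welford's algorithm); a sketch that prints the running mean and population standard deviation:

    awk '{d=$1-m; m+=d/NR; s+=d*($1-m)} END {print m, sqrt(s/NR)}'
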

  • 6
    (echo "set terminal png;plot '-' u 1:2 t 'cpu' w linespoints;"; sudo vmstat 2 10 | awk 'NR > 2 {print NR, $13}') | gnuplot > plot.png
    grokskookum · 2009-09-23 16:40:13 12

  • 6
    awk '{print length, $0;}' | sort -nr
    ashawley · 2009-10-07 16:32:27 4
  • Change the name of the process and what is echoed to suit your needs. The brackets around the h in the grep statement cause grep to skip over its own "grep httpd" process; it is the equivalent of adding grep -v grep, but more elegant.


    6
    TOTAL_RAM=`free | head -n 2 | tail -n 1 | awk '{ print $2 }'`; PROC_RSS=`ps axo rss,comm | grep [h]ttpd | awk '{ TOTAL += $1 } END { print TOTAL }'`; PROC_PCT=`echo "scale=4; ( $PROC_RSS/$TOTAL_RAM ) * 100" | bc`; echo "RAM Used by HTTP: $PROC_PCT%"
    d34dh0r53 · 2010-02-26 20:29:45 8
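    A variant of the same calculation that skips bc and reads the total from /proc/meminfo (a sketch, Linux-specific):

    ps axo rss,comm | awk -v total="$(awk '/^MemTotal/{print $2}' /proc/meminfo)" '/[h]ttpd/{sum+=$1} END {printf "RAM Used by HTTP: %.2f%%\n", 100*sum/total}'
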
  • Uses lsof to display the full path of ".log" files opened by a specified PID.


    6
    lsof -p 1234 | grep -E "\.log$" | awk '{print $NF}'
    zlemini · 2010-03-05 11:41:28 5

  • 6
    ffmpeg -f x11grab -s `xdpyinfo | grep 'dimensions:'|awk '{print $2}'` -r 25 -i :0.0 -sameq /tmp/out.mpg > /root/howto/capture_screen_video_ffmpeg
    brubaker · 2010-03-27 21:31:34 5
  • Counts the number of processes for each user. BSD-style ps output is used here; for standard Unix format use: ps -eLf |awk '{$1} {++P[$1]} END {for(a in P) if (a !="UID") print a,P[a]}'


    6
    ps aux |awk '{$1} {++P[$1]} END {for(a in P) if (a !="USER") print a,P[a]}'
    benyounes · 2010-04-28 15:25:18 3
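    A shorter equivalent using sort and uniq (a sketch; the trailing "=" in the format specifier suppresses the header so no special-casing is needed):

    ps -eo user= | sort | uniq -c | sort -rn
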
  • Most people take photos in landscape orientation (wider than tall). Sometimes, though, you turn the camera sideways to capture a narrow/tall subject. Assuming you then manually rotate those picture files 90 degrees for proper viewing on screen or photo frame, you now have a mix of orientations in your photos directory. This command prints the names of all photos in the current directory whose vertical resolution is larger than their horizontal resolution (i.e. portrait orientation). You can then take that list of files and deal with them however you need to, like re-rotating back to landscape for consistent printing with all the others. This command requires the "identify" command from the ImageMagick command-line image manipulation suite. Sample output from identify: identify PICT2821.JPG PICT2821.JPG JPEG 1536x2048 1536x2048+0+0 8-bit DirectClass 688KB 0.016u 0:00.006


    6
    for i in *; do identify $i | awk '{split($3,a,"x"); if (a[2]>a[1]) print $1;}'; done
    dmmst19 · 2014-05-27 23:41:24 10
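    ImageMagick's identify can also emit just the dimensions, which avoids parsing its full output; a sketch (assumes filenames without spaces):

    identify -format '%w %h %f\n' * | awk '$2 > $1 {print $3}'
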

  • 6
    find /glftpd/site/archive -type f|grep '([0-9]\{1,9\})\.[^.]\+$'|parallel -n1 -j200% md5sum ::: |awk 'x[$1]++ { print $2 " :::"}'|sed 's/^/Dupe: /g'|sed 's,Dupe,\x1B[31m&\x1B[0m,'
    wuseman1 · 2019-10-22 16:02:15 139
  • Rebuilds a .deb of every Debian package installed on your system, including any changes you made to its configuration files, and places them in the current directory. Make sure you have enough disk space (say 2-3 GB). Break at any time with Ctrl+C.


    6
    for a in $(sudo dpkg --get-selections|cut -f1); do dpkg-repack $a|awk '{if (system("sleep .5 && exit 2") != 2) exit; print}';done
    knoppix5 · 2021-01-17 17:07:02 323
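    If you only need a handful of packages and don't care about the Ctrl+C throttling trick, a plain loop is enough; a sketch with example package names:

    for a in bash coreutils; do sudo dpkg-repack "$a"; done
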

  • 5
    awk '{count[length]++}END{for(i in count){printf("%d: %d\n", count[i], i)}}'
    bpfx · 2009-02-05 14:20:39 26
  • This command specifies the size in kilobytes using 'k' in the -size +(N)k option; the plus sign means "greater than". -exec [cmd] {} \; invokes ls -l on each file, and awk strips out the 5th (size) and 9th (filename) columns of the ls -l output for display. Sorting is done in reverse (descending) numeric order with sort -rn. A cron job could run a script like this to alert users when a directory has files exceeding a certain size, and provide the file details as well.


    5
    find . -size +10240k -exec ls -l {} \; | awk '{ print $5,"",$9 }'|sort -rn > message.out
    rommelsharma · 2009-02-17 19:39:56 10
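    With GNU find the same report needs no ls or awk; a sketch using -printf (size in bytes, then path):

    find . -size +10240k -printf '%s %p\n' | sort -rn > message.out
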
  • Note that the file at the given path will have the contents of the (still) deleted file, but it is a new file with a new inode number; in other words, this restores the data, but it does not actually "undelete" the old file. I posted a function declaration encapsulating this functionality to http://www.reddit.com/r/programming/comments/7yx6f/how_to_undelete_any_open_deleted_file_in_linux/c07sqwe (please excuse the crap formatting).


    5
    N="filepath" ; P=/proc/$(lsof +L1 | grep "$N" | awk '{print $2}')/fd ; ls -l $P | sed -rn "/$N/s/.*([0-9]+) ->.*/\1/p" | xargs -I_ cat $P/_ > "$N"
    laburu · 2009-02-21 02:31:24 28
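    The same recovery can be done by hand once you know the PID and file descriptor reported by lsof +L1, since the deleted file's data is still reachable under /proc; a sketch with placeholder PID/FD values:

    lsof +L1                        # note the PID and FD columns for the deleted-but-open file
    cp /proc/1234/fd/15 ./restored  # 1234 and 15 are placeholders for the real PID and FD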