Commands tagged awk (348)


  • 0
    sudo yum remove streams-$(uname -r)
    totalnut151 · 2013-11-21 17:41:19 6
  • This reports the total size of the RPM packages, not the installed size of their files. A variation that lists the largest individual packages is sketched below the entry.


    0
    rpm -qa --queryformat '%{SIZE}\n' | awk '{sum += $1} END {printf("Total size in packages = %4.1f GB\n", sum/1024**3)}'
    skytux · 2013-12-14 20:22:41 10
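
    A variation (not from the original post) that lists the largest individual packages first; %{NAME} and %{SIZE} are standard rpm query tags:

    rpm -qa --queryformat '%{SIZE} %{NAME}\n' | sort -rn | head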

  • 0
    while [ 1 ] ;do ps aux|awk '{if ($8 ~ "D") print }'; sleep 1 ;done
    paulp · 2014-01-21 08:20:04 6
  • Grep can search files and directories recursively. By adding the -Z option and piping into xargs -0, the matching filenames are NUL-separated, so they can be passed safely (even with spaces in their names) to other commands such as rm; a sketch of that combination follows the entry.


    0
    grep -Rl "pattern" files_or_dir
    N1nsun · 2014-04-06 18:18:07 7
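
    A sketch of the combination described above (not part of the original post), deleting every file that contains the pattern:

    grep -RlZ "pattern" files_or_dir | xargs -0 rm
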

  • 0
    grep URL ~/annex/.git/annex/webapp.html | tr -d '">' | awk -F= '{print $4 "=" $5}'
    kseistrup · 2014-04-20 08:46:37 8

  • 0
    ip route list 0/0
    thrix · 2014-06-09 16:07:38 7

  • 0
    mco ping | head -n -4 | awk '{print $1}' | sort
    mrwulf · 2014-06-24 18:20:16 7
  • The extra cat and grep are wasted when awk can read the file and match the pattern itself. Original command: cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100


    0
    awk '/text to grep/{print $1}' "log" | sort -n | uniq -c | sort -rn | head -n 100
    kln0thing · 2014-07-09 08:48:06 9
  • The awk part walks the fields from the 2nd to the last (NF); this is meant to cope with svn paths that contain spaces, which is typical when the code is shared with a Windows environment, for example by documentation teams. For each field it builds an "ls -ld" command; the -d option tells ls to list the directory entry itself rather than its contents, and an optional -p flag marks directories with a trailing slash for readability. Finally, the constructed commands are piped to sh for execution. A variant that joins the fields into a single quoted path is sketched below the entry.


    0
    svn status | awk -F" " '{ for (i=2; i<=NF; i++) print "ls -ld \""$i"\""}' | sh
    kln0thing · 2014-07-09 09:41:24 14
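
    A sketch (not part of the original post) that joins fields 2 through NF into a single quoted path, so names containing single spaces survive; note that runs of whitespace collapse to one space when awk rebuilds the line:

    svn status | awk '{ $1=""; sub(/^ +/, ""); printf "ls -ld \"%s\"\n", $0 }' | sh
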
  • Gets the Hardware UUID of the current machine using system_profiler.


    0
    system_profiler SPHardwareDataType | awk '/UUID/ { print $3; }'
    thealanberman · 2014-07-25 06:54:40 8
  • This command draws a small text histogram of size buckets (5 MB in this example), not of individual files. To change the bucket size, adjust the 4+5*int($1/5) part following the pattern jump-1+jump*int($1/jump); likewise, tune the hist=hist-5 step for bigger or smaller graphs. A 10 MB variant is sketched below the entry.


    0
    du -sm *| sort -nr | awk '{ size=4+5*int($1/5); a[size]++ }; END { print "size(from->to) number graph"; for(i in a){ printf("%d %d ",i,a[i]) ; hist=a[i]; while(hist>0){printf("#") ; hist=hist-5} ; printf("\n")}}'
    higuita · 2014-08-19 14:43:20 8
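
    The same command with a 10 MB bucket size, a sketch applying the tuning described above:

    du -sm *| sort -nr | awk '{ size=9+10*int($1/10); a[size]++ }; END { print "size(from->to) number graph"; for(i in a){ printf("%d %d ",i,a[i]) ; hist=a[i]; while(hist>0){printf("#") ; hist=hist-5} ; printf("\n")}}'
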
  • Caution: this destructively overwrites filenames. Useful for concatenating PDFs in date order using pdftk. A dry-run preview is shown below the entry.


    0
    find . -name "*.pdf" -print0 | xargs -r0 stat -c %y\ %n | sort|awk '{print $4}'|gawk 'BEGIN{ a=1 }{ printf "mv %s %04d.pdf\n", $0, a++ }' | bash
    Randy_Legault · 2014-09-23 06:40:45 9
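
    Because the rename is destructive, it may be worth previewing the generated mv commands first by dropping the final "| bash":

    find . -name "*.pdf" -print0 | xargs -r0 stat -c %y\ %n | sort|awk '{print $4}'|gawk 'BEGIN{ a=1 }{ printf "mv %s %04d.pdf\n", $0, a++ }'
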
  • It's possible to use a simple regex to extract the username from the finger command. The final echo is optional, just to remove the leading space. An awk-only sketch follows the entry.


    0
    finger $(whoami) | egrep -o 'Name: [a-zA-Z0-9 ]{1,}' | cut -d ':' -f 2 | xargs echo
    swebber · 2014-09-24 01:22:07 9
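
    An awk-only sketch of the same idea; it assumes the Name: field appears on the first line of finger's output:

    finger $(whoami) | awk -F'Name: ' 'NR==1 { print $2 }'
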
  • Given a hosts list, ssh to each host in turn and echo its name only if 'processname' is not running.


    0
    for i in `cat hosts_list`; do RES=`ssh myusername@${i} "ps -ef " |awk '/[p]rocessname/ {print $2}'`; test "x${RES}" = "x" && echo $i; done
    arlequin · 2014-10-03 14:57:54 9
  • This is useful as a git hook to print the top-level directories whose files changed in the latest commit. Each directory is its own package. A minimal hook sketch follows the entry.


    0
    git log -n 1 --name-only --pretty=oneline | awk -F/ 'NR>=2 {seen[$1]}; END {for (d in seen) print d}'
    Romster · 2014-12-13 10:21:46 9
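
    A minimal post-commit hook sketch built around the same pipeline (the hook itself is not part of the original post):

    #!/bin/sh
    # .git/hooks/post-commit (hypothetical path): list the top-level directories touched by the commit
    git log -n 1 --name-only --pretty=oneline | awk -F/ 'NR>=2 {seen[$1]}; END {for (d in seen) print d}'
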
  • The sample output shows each record/row with the last field zero-padded to 26 digits. For testing, line and field/column numbers were used as data: line 4, field 2 = L42, and so on, with the last field simply the line number times 4. The use case was whitespace-delimited files with variable-length records (4 or 5 fields/columns) whose last field had to be zero-padded to 26 digits. The command reads NF (not $NF) into an awk variable and uses a simple conditional that assumes any line whose number of fields is not 4 has 5; more conditional checks can be added if needed, and the padded field can be changed to any other ($1, $5, etc). A simpler sketch that pads the last field regardless of the field count follows the entry.


    0
    awk '{var = sprintf(NF); if (var == 4) printf "%s %s %s %026d\n" , $1,$2,$3,$4; else printf "%s %s %s %s %026d\n" , $1,$2,$3,$4,$5}' yourfilegoes.here >> yournewfilegoes.here
    genatomics · 2014-12-20 02:53:35 8
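
    A simpler sketch (not from the original post) that zero-pads the last field whatever the field count; note that awk rebuilds each line with single spaces between fields:

    awk '{ $NF = sprintf("%026d", $NF); print }' yourfilegoes.here >> yournewfilegoes.here
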
  • Use this command to watch Apache access logs in real time and see which pages are being hit. A running per-IP tally is sketched below the entry.


    0
    tail -f access_log | awk '{print $1 , $12}'
    tyzbit · 2014-12-24 14:15:52 11
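
    A variation (not from the original post) that keeps a running count of requests per client IP as lines arrive:

    tail -f access_log | awk '{ count[$1]++; printf "%s %d\n", $1, count[$1] }'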

  • 0
    sudo du -kx / |sort -n| awk '{print $1/(1000*1000) " G" ,$2}'
    umiyosh · 2015-01-05 04:49:24 8
  • OS X users, as well as Linux users with equivalent clipboard commands, can remove duplicate lines from their copy buffer with this command. I use this often when I have to copy a long list of items that I didn't generate but need to paste elsewhere as a unique list. If retaining the original order of lines isn't important to you, use the following command, which is easier to remember: pbpaste | sort | uniq | pbcopy. A Linux sketch using xclip follows the entry.


    0
    pbpaste | awk ' !x[$0]++' | pbcopy
    dmengelt · 2015-02-05 19:38:38 12
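
    A hypothetical Linux equivalent, assuming xclip is installed and the X clipboard selection is the one wanted:

    xclip -selection clipboard -o | awk '!x[$0]++' | xclip -selection clipboard
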
  • Use lsof and grep for any PID matching a given process name, such as "node". A shorter sketch using pgrep is shown below the entry.


    0
    lsof -i -n -P | grep -e "$(ps aux | grep node | grep -v grep | awk -F' ' '{print $2}' | xargs | awk -F' ' '{str = $1; for(i = 2; i <= NF; i++) {str = str "\\|" $i} print str}')"
    hochmeister · 2015-02-14 23:24:00 10
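
    A shorter sketch of the same idea: -a makes lsof AND the -i (internet files) and -p (PID list) selections, and pgrep -d, builds the comma-separated PID list:

    lsof -a -i -n -P -p "$(pgrep -d, node)"
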
  • Replaces a grep | sed pipeline with a single awk script.


    0
    watch -n10 -d sh -c 'sensors | awk '\''/:.*RPM/ { sub("[^:]*:","") ; print $1 }'\'
    my_username · 2015-04-29 16:50:28 10

  • 0
    pgrep -f /usr/sbin/httpd | awk '{print"-p " $1}' | xargs strace
    savagemike · 2015-06-10 22:55:35 12
  • Removes directories which are less than 1028KB total. This works for systems where blank directories are 4KB. If a directory contains 1 MB (1024KB) or less, it will remove the directory using a path relative to the directory where the command was initially executed (safer than some other options I found). Adjust the 1028 value for your needs. It would be helpful to test the results before proceeding with the removal. Simply run all but the last two commands to see a list of what will be removed: du | awk '{if($1<1028)print;}' | cut -d $'\t' -f 2- If you're unsure what size a blank folder is, test it like this: mkdir test; du test; rmdir test


    0
    du | awk '{if($1<1028)print;}' | cut -d $'\t' -f 2- | tr "\n" "\0" | xargs -0 rm -rf
    i814u2 · 2015-06-25 16:00:48 11
  • Don't want to open an editor just to view a bunch of XML files in an easy-to-read format? Now you can do it from the comfort of your own command line! :-) This creates a new function, xmlpager, which shows an XML file in its entirety but with the actual content (non-tag text) highlighted. It does this by setting the foreground to color #4 (red) after every tag and resetting it before the next tag (hint: try `tput bold` as an alternative). xmlindent is used to neatly reflow and indent the text, but that's optional; if you don't have xmlindent, just replace it with cat. The example also pipes into the optional less pager; note the -r option, which allows raw escape codes to be passed to the terminal. A usage sketch follows the entry.


    0
    xmlpager() { xmlindent "$@" | awk '{gsub(">",">'`tput setf 4`'"); gsub("<","'`tput sgr0`'<"); print;} END {print "'`tput sgr0`'"}' | less -r; }
    hackerb9 · 2015-07-12 09:22:10 11
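
    A usage sketch, once the function has been defined in the current shell (config.xml is a placeholder filename):

    xmlpager config.xml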

  • 0
    eval `cli53 list |grep Name | sed "s/\.$//g" | awk '{printf("echo %s; cli53 export %s > %s;\n", $2, $2, $2);}'`
    cfb · 2015-07-21 14:16:30 10

