Commands by kevinquinnyo (3)

  • I'm using gawk; your mileage may vary with other awk variants. You may want to change the / after du to /home/ or /var or similar, otherwise this command can take quite some time to complete. Sorry it's so obfuscated; I had to turn a script into a one-liner under 255 characters for commandlinefu (a readable, commented version follows this entry). Note: the bar lengths are relative, so the directory with the highest share of the total disk "anchors" the rest of the graph. EDIT: the math was slightly wrong, fixed it. Also made it compatible with older versions of df.


    13
    t=$(df|awk 'NR!=1{sum+=$2}END{print sum}');sudo du / --max-depth=1|sed '$d'|sort -rn -k1 | awk -v t=$t 'OFMT="%d" {M=64; for (a=0;a<$1;a++){if (a>c){c=a}}br=a/c;b=M*br;for(x=0;x<b;x++){printf "\033[1;31m" "|" "\033[0m"}print " "$2" "(a/t*100)"% total"}'
    kevinquinnyo · 2011-12-01 01:21:11 9
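
For readability, here is the same logic unrolled into a commented script. This is my sketch of an equivalent, not the author's original script, and the variable names are mine:

    #!/bin/bash
    # Grand total of 1K-blocks across all mounted filesystems (skip df's header line)
    total=$(df | awk 'NR != 1 {sum += $2} END {print sum}')

    # Per-directory usage under /, largest first; sed drops du's final grand-total line
    sudo du / --max-depth=1 | sed '$d' | sort -rn -k1 |
    awk -v total="$total" '
        BEGIN { width = 64 }             # maximum bar width in characters
        NR == 1 { max = $1 }             # input is sorted, so the first line is the largest
        {
            bar = int(width * $1 / max)  # bar length relative to the largest entry
            for (i = 0; i < bar; i++) printf "\033[1;31m|\033[0m"
            printf " %s %d%% total\n", $2, 100 * $1 / total
        }'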
  • Creates an associative array of hit counts per IP from Apache logs; assumes the "combined" log format or similar. Replace the awk column to suit your needs; bandwidth per IP is also useful (see the variant after this entry). Have fun. I haven't found a more efficient way to do this yet. FIXED TYPO: the log file should obviously go after awk, which then pipes into sort.


    1
    awk '{array[$1]++}END{ for (ip in array) print array[ip] " " ip}' <path/to/apache/*.log> | sort -n
    kevinquinnyo · 2011-11-22 03:38:21 2
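
Since the description mentions bandwidth per IP: in the combined log format the response size in bytes is field 10, so a variant along the same lines might look like this (my sketch, not one of the author's submissions; "-" sizes count as zero in awk's numeric context):

    awk '{bytes[$1] += $10} END {for (ip in bytes) print bytes[ip] " " ip}' <path/to/apache/*.log> | sort -rn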
  • This is good for cleaning up a log file without having to erase its entire contents; it lets you keep only the most recent entries in the log.


    -1
    sed -i '1,500d' foo.log
    kevinquinnyo · 2011-05-23 09:41:35 1
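
If you would rather keep a fixed number of recent lines than delete a fixed number of old ones, the same cleanup can be done with tail (a sketch; the 1000-line cutoff is arbitrary):

    tail -n 1000 foo.log > foo.log.tmp && mv foo.log.tmp foo.log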

Check These Out

Remove a range of lines from a file
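No description or command is shown for this entry; the usual sed form, with hypothetical line numbers, is:

    sed -i '5,10d' file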

Display emerge.log dates in a human-friendly way
Gentoo or Gentoo-like Linux distributions only.
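
The command itself isn't shown here. Lines in /var/log/emerge.log begin with an epoch timestamp and a colon, so a gawk reconstruction (my assumption; gawk is needed for strftime) could be:

    gawk -F: '{printf "%s:%s\n", strftime("%F %T", $1), substr($0, index($0, ":") + 1)}' /var/log/emerge.log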

Copy sparse files
This makes cp detect large runs of null bytes and omit them from the copy, producing a sparse file: one that appears to be its full size while the null runs consume no actual disk space. See http://en.wikipedia.org/wiki/Sparse_file. You can use it in a pipe too: $ dd if=/dev/zero bs=1M count=5 | cp --sparse=always /dev/stdin SPARSE_FILE
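
The standalone form implied by the description is presumably just (file names are placeholders):

    cp --sparse=always original_file sparse_copy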

Most Commonly Used Grep Options
Placing this in a shell startup file makes grep use those options all the time. As described, it suppresses warning messages about unreadable files, skips devices such as FIFOs and pipes, and ignores case by default.
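
The command itself isn't shown here; a plausible reconstruction matching that description (the exact flags are my assumption) is an alias in a startup file such as ~/.bashrc:

    alias grep='grep -s -i --devices=skip'

Here -s suppresses error messages about unreadable or missing files, -i ignores case, and --devices=skip skips devices, FIFOs and sockets.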

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1"
        D=$(( SS / 86400 ))          # whole days
        H=$(( SS % 86400 / 3600 ))   # remaining hours
        M=$(( SS % 3600 / 60 ))      # remaining minutes
        S=$(( SS % 60 ))             # remaining seconds
        # Print days and hours only when needed; hours must appear whenever days do
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$D" -gt 0 ] || [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }
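
A quick check with a hypothetical input: $ sec2dhms 100000 prints 1:03:46:40 (1 day, 3 hours, 46 minutes, 40 seconds).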

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
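
The original command isn't shown; one shell one-liner that does the conversion (a sketch using Python's standard csv and json modules, assuming the file has a header row) is:

    python3 -c 'import csv, json, sys; print(json.dumps(list(csv.DictReader(open(sys.argv[1])))))' csv_file.csv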

List the most recent dates in reverse-chronological order

Command to logout all the users in one command

