Commands tagged log (39)

  • This will log your internet download speed. You can run gnuplot -persist <(echo "plot 'bps' with lines") to get a graph of it.


    9
    echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w %{speed_download} -o/dev/null -s $URL) >> bps; sleep 10; done &
    matthewbauer · 2009-09-19 21:26:06 0
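
    curl reports %{speed_download} in bytes per second; to graph megabits instead, convert the log first (a minimal sketch, assuming the loop above is writing to ./bps):

    awk '{print $1*8/1000000}' bps > mbps   # bytes/s -> Mbit/s
    gnuplot -persist <(echo "plot 'mbps' with lines")
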
  • Using grep, retrieve all lines from the log files in /var/log/ that contain one of the problem states (error, warning, failure, critical), with two lines of context.


    6
    grep -2 -iIr "err\|warn\|fail\|crit" /var/log/*
    miketheman · 2009-06-17 19:41:04 2
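
    Rotated logs under /var/log are usually compressed; zgrep runs the same search through those as well (a sketch, assuming gzip-rotated logs):

    zgrep -i "err\|warn\|fail\|crit" /var/log/*.gz
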
  • Uses date to grep the logfile for today's date and the current hour, i.e. the last hour's logs. The same trick works for the last minute or the whole day, as sketched below.


    6
    grep -i "$(date +%b\ %d\ %H)" syslog
    rubenmoran · 2010-05-23 16:18:15 4
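
    The same pattern narrows or widens naturally (sketches, assuming the traditional syslog timestamp format):

    grep -i "$(date '+%b %d %H:%M')" syslog   # last minute
    grep -i "$(date '+%b %d')" syslog         # whole day
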
  • Returns log lines between HH:Mx and HH:My, for example between 13:40 and 13:45.


    6
    grep -i "$(date +%b" "%d )13:4[0-5]" syslog
    rubenmoran · 2010-05-23 16:30:46 3
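
    For a window that crosses a ten-minute boundary, a sed address range is easier (a sketch; both endpoint timestamps must actually occur in the log):

    sed -n "/$(date '+%b %d') 13:55/,/$(date '+%b %d') 14:10/p" syslog
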
  • With negated wildcard classes, bash can tail only the current log files (skipping rotated *.1 and *.gz copies) to watch what happens: errors, info, warnings... A more explicit extglob alternative is sketched below.


    5
    tail -f *[!.1][!.gz]
    piscue · 2009-03-06 16:24:44 3
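
    A more explicit way to skip rotated copies is bash's extended globbing (a sketch, assuming bash with extglob enabled):

    shopt -s extglob
    tail -f !(*.[0-9]|*.gz)
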
  • This command finds the 5 (-n5) most recently updated logs in /var/log, and then does a multifile tail follow of those log files. Alternately, you can do this to follow a specific list of log files: sudo tail -n0 -f /var/log/{messages,secure,cron,cups/error_log}


    5
    ls -drt /var/log/* | tail -n5 | xargs sudo tail -n0 -f
    kanaka · 2009-07-22 14:44:41 0
  • Just change the date following the -r flag and/or the user name in the user== conditional statement, and substitute yms_web with the name of your module.


    4
    svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
    jemptymethod · 2009-06-05 14:07:28 3

  • 4
    zcat access_log.*.gz | awk '{print $7}' | sort | uniq -c | sort -n | tail -n 20
    tkb · 2009-12-11 09:36:30 0
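
    In Apache's common/combined log format, $7 is the request path, so this prints the 20 most-requested paths; the same pipeline works on an uncompressed log too (a sketch):

    awk '{print $7}' access_log | sort | uniq -c | sort -rn | head -n 20
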
  • When debugging an ssh connection, either to tune your settings (compression, ciphers) or, more commonly, to diagnose a connection problem, this alias comes in real handy: the '-o LogLevel=DEBUG3' argument is not easy to remember, and it adds a boost of debugging info not available with -vvv alone. Especially useful are the FD info and the setup negotiation, for creating a cleaner, faster connection.


    4
    alias sshv='ssh -vvv -o LogLevel=DEBUG3'
    AskApache · 2010-10-30 11:23:52 0
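
    In typical use the debug stream (which ssh writes to stderr) is captured for later inspection (a sketch, with a hypothetical host):

    sshv user@example.com 2> ssh-debug.log
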

  • 4
    curl http://whatthecommit.com/index.txt
    nickoe · 2011-12-29 22:35:28 0
  • Tried and tested on Ubuntu 12.04. Works great for tailing any file over HTTP; a client-side sketch follows below.


    4
    (echo -e "HTTP/1.1 200 Ok\n\r"; tail -f /var/log/syslog) | nc -l 1234
    adimania · 2013-02-09 06:15:42 0
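
    Any HTTP client on another machine can then consume the stream (a sketch; substitute the server's hostname and the port chosen above):

    curl -sN http://server:1234/
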
  • This truncates any lines longer than 80 characters. Also useful for looking at different parts of the line, e.g. cut -b 50-100 shows columns 50 through 100.


    3
    tail -f logfile.log | cut -b 1-80
    plasticboy · 2009-03-26 18:41:57 1

  • 3
    tail -f /var/log/squid/access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
    godzillante · 2011-07-06 08:55:27 3
  • When you have one of those (log)files that only have epoch timestamps (since no one will ever look at them as a date), this is a way to get a human-readable date/time and do further inspection. Mostly perl-fu :-/


    2
    perl -F' ' -MDate::Format -pale 'substr($_, index($_, $F[1]), length($F[1]), time2str("%C", $F[1]))' file.log
    coffeeaddict_nl · 2009-08-13 13:57:33 0
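
    If gawk is available, the same epoch-to-date rewrite needs less perl-fu (a sketch, assuming the epoch is the first field):

    gawk '{$1=strftime("%F %T",$1)}1' file.log
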
  • This logs the titles of the active windows, so you can monitor what you have done during which times. (It is not hard to also log the executable name, but then it gets too long.)


    2
    while true; do (echo -n $(date +"%F %T"):\ ; xwininfo -id $(xprop -root|grep "ACTIVE_WINDOW("|cut -d\ -f 5) | grep "Window id" | cut -d\" -f 2 ) >> logfile; sleep 60; done
    BeniBela · 2015-09-23 23:00:14 13
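
    If xdotool is installed, grabbing the active window title is considerably shorter (a sketch under that assumption):

    while true; do echo "$(date '+%F %T'): $(xdotool getactivewindow getwindowname)" >> logfile; sleep 60; done
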
  • A really useful way to combine less and grep while browsing log files. It can't easily be made into a true one-liner, so paste it into a script file called lgrep (a sketch follows below). Usage: lgrep searchfor file1 [file2 file3]. Advanced example (grep for an Exception in logfiles whose names start with qc): lgrep Exception $(find . -name "qc*.log")


    1
    argv=("$@"); rest=${argv[@]:1}; less -JMN +"/$1" `grep -l $1 $rest`
    lassel · 2009-10-16 17:36:16 0
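
    Saved as the script file the description asks for (a sketch; filenames containing spaces are not handled):

    #!/bin/bash
    # lgrep: open files containing $1 in less, jumping to the first match.
    # Usage: lgrep searchfor file1 [file2 ...]
    argv=("$@"); rest=${argv[@]:1}
    less -JMN +"/$1" `grep -l $1 $rest`
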
  • This command returns a full list of Error 404 pages from the given access log. The following fields are passed to awk: hostname ($2), error code ($9), missing item ($7), referrer ($11). You can then redirect the output to a file (>> /path/to/file) and open it with OpenOffice as a CSV.


    1
    sudo awk '($9 ~ /404/)' /var/log/httpd/www.domain-access_log | awk '{print $2,$9,$7,$11}' | sort | uniq -c
    ninjasys · 2010-04-09 10:31:50 3
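
    To get a file OpenOffice opens directly as CSV, set awk's output separator instead (a sketch; uniq -c is dropped because its counts are not comma-separated):

    sudo awk -v OFS=',' '$9 ~ /404/ {print $2,$9,$7,$11}' /var/log/httpd/www.domain-access_log > 404-report.csv
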

  • 1
    git log -p -z | perl -ln0e 'print if /[+-].*searchedstring/'
    takeshin · 2010-06-13 07:41:22 0
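
    Git's built-in pickaxe options do much the same without perl; -G matches added or removed lines against a regex:

    git log -G searchedstring -p
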
  • GoAccess is an open source real-time Apache web log analyzer and interactive viewer that runs in a terminal in *nix systems. It provides fast and valuable HTTP statistics for system administrators that require a visual server report on the fly. http://goaccess.prosoftcorp.com/


    1
    goaccess -f /var/log/apache2/access.log -s -b
    allinurl · 2010-10-25 20:03:18 1
  • Replace USERNAME with the desired svn username. Replace the first YYYY-MM-DD with the date you want the log for (the scan starts at the midnight that begins that date), and the second YYYY-MM-DD with the day after (the scan ends at midnight of the previous day). For example, to get the log for December 10, 2010, use {2010-12-10}:{2010-12-11}.


    1
    svn log -r '{YYYY-MM-DD}:{YYYY-MM-DD}' | sed -n '1p; 2,/^-/d; /USERNAME/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'
    antic · 2010-12-22 17:52:19 1
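
    Filled in with the example date from the description and a hypothetical username jsmith:

    svn log -r '{2010-12-10}:{2010-12-11}' | sed -n '1p; 2,/^-/d; /jsmith/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'
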
  • This awk command prints a histogram of the number of times 'emergency' is the first word in a line, per day, in an irssi (IRC client) log file.


    0
    awk '/^--- Day changed (.*)/ {st=""; for (i=0;i<ar[date];i++) {st=st"*"} print date" "st; date=$7"-"$5"-"$6} /> emergency/ {ar[date]++} END {st=""; for (i=0;i<ar[date];i++) {st=st"*"}; print date" "st}' '#engineyard.log'
    menicosia · 2010-02-24 22:54:34 1
  • The same GoAccess analyzer as above, here fed only the log entries from 05/Dec/2010 onward via sed.


    0
    sed -n '/05\/Dec\/2010/,$ p' access.log | goaccess -s -b
    allinurl · 2010-12-13 17:37:33 0
  • The same squid access log tail as above, with colors (ccze).


    0
    tail -f /var/log/squid/access.log | ccze -CA
    longdrink · 2011-07-15 14:58:53 2
  • If you don't have html2text, this extracts the random commit message straight from the HTML.


    0
    curl -s 'http://whatthecommit.com/' | grep '<p>' | cut -c4-
    hputman · 2011-11-04 14:37:36 0
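
    For comparison, the html2text form the description alludes to (a sketch, assuming html2text is installed):

    curl -s 'http://whatthecommit.com/' | html2text
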
  • A breakdown of the pipeline:
    svn log -v takes the full log.
    Filter 1: -r {from}:{to} restricts the revision range.
    Filter 2: awk matches the 'r<number>' header lines and assigns user = third column (the username).
    Filter 3: if username == "george", print the line.
    Filter 4: keep only lines starting with M/U/G/C/A/D (A added, D deleted, U updated, G merged, C conflicted).
    Filter 5: sort all file names.
    Filter 6: print only each unique file name.


    0
    svn log -v -r{2009-11-1}:HEAD | awk '/^r[0-9]+ / {user=$3} /./{if (user=="george") {print}}' | grep -E "^ M|^ G|^ A|^ D|^ C|^ U" | awk '{print $2}' | sort | uniq
    smilyface · 2011-12-05 07:36:44 0