Commands tagged log (39)

  • This will log your internet download speed every ten seconds. You can run gnuplot -persist <(echo "plot 'bps' with lines") to get a graph of it (a timestamped variant is sketched below the entry).


    10
    echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w '%{speed_download}' -o /dev/null -s $URL) >> bps; sleep 10; done &
    matthewbauer · 2009-09-19 21:26:06 13
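
    A minimal sketch of the timestamped variant mentioned above: each sample is written as "epoch-seconds bytes-per-second", so gnuplot's using 1:2 can plot speed against real time (same URL and output file as the original):

    URL=http://www.google.com
    while true; do
        # column 1: epoch seconds, column 2: download speed in bytes/sec
        echo "$(date +%s) $(curl -L -w '%{speed_download}' -o /dev/null -s "$URL")" >> bps
        sleep 10
    done &
    gnuplot -persist <(echo "plot 'bps' using 1:2 with lines")
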
  • Using grep, retrieve all lines (plus two lines of context, -2) from any log file under /var/log/ that mention one of the problem states: err, warn, fail, or crit.


    6
    grep -2 -iIr "err\|warn\|fail\|crit" /var/log/*
    miketheman · 2009-06-17 19:41:04 10
  • Uses date to grep the logfile for today's date and the current hour, i.e. the last hour's logs; %e rather than %d, because syslog space-pads single-digit days. The same trick yields the last minute's or the whole day's logs (variants below the entry).


    6
    grep -i "$(date +%b\ %d\ %H)" syslog
    rubenmoran · 2010-05-23 16:18:15 9
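
    The variants mentioned above, under the same assumptions (file named syslog, traditional space-padded timestamps):

    grep -i "$(date '+%b %e %H:%M')" syslog   # the current minute only
    grep -i "$(date '+%b %e')" syslog         # the whole day
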
  • Returns logs between HH:M[Mx-My]; for example, between 13:40 and 13:45.


    6
    grep -i "$(date +%b" "%d )13:4[0-5]" syslog
    rubenmoran · 2010-05-23 16:30:46 6
  • With negated character classes in bash wildcards you can tail only the current log files, skipping rotated *.1 and *.gz files, to see what is happening: any error, info, warn... (a tighter extglob version follows the entry)


    5
    tail -f *[!.1][!.gz]
    piscue · 2009-03-06 16:24:44 8
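
    A tighter alternative, assuming your shell is bash with the extglob option available; !(...) excludes the rotated files exactly rather than approximately:

    shopt -s extglob
    tail -f !(*.1|*.gz)
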
  • This command finds the 5 (-n5) most recently updated logs in /var/log and then does a multi-file tail follow of them. Alternatively, follow a specific list of log files: sudo tail -n0 -f /var/log/{messages,secure,cron,cups/error_log}


    5
    ls -drt /var/log/* | tail -n5 | xargs sudo tail -n0 -f
    kanaka · 2009-07-22 14:44:41 22
  • Just change the date following the -r flag and/or the user name in the user== conditional, and substitute yms_web with the name of your module.


    4
    svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
    jemptymethod · 2009-06-05 14:07:28 11

  • Lists the 20 most requested URLs (awk field $7) across rotated, gzipped Apache access logs.


    4
    zcat access_log.*.gz | awk '{print $7}' | sort | uniq -c | sort -n | tail -n 20
    tkb · 2009-12-11 09:36:30 4
  • When debugging an ssh connection, whether to optimize your settings (compression, ciphers) or, more commonly, to track down a connection issue, this alias comes in really handy: it's not easy to remember '-o LogLevel=DEBUG3', which adds a boost of debugging info not available with -vvv alone. Especially useful are the FD info and the setup negotiation, for creating a cleaner, faster connection.


    4
    alias sshv='ssh -vvv -o LogLevel=DEBUG3'
    AskApache · 2010-10-30 11:23:52 4

  • Prints a random commit message from whatthecommit.com's plain-text endpoint.


    4
    curl http://whatthecommit.com/index.txt
    nickoe · 2011-12-29 22:35:28 3
  • Tried and tested on Ubuntu 12.04. Works great for tailing any file over HTTP (a client-side example follows the entry).


    4
    (echo -e "HTTP/1.1 200 Ok\n\r"; tail -f /var/log/syslog) | nc -l 1234
    adimania · 2013-02-09 06:15:42 5
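
    On the client side, one plausible way to follow the stream (the host name loghost is hypothetical; -N disables curl's output buffering so lines appear as they arrive):

    curl -sN http://loghost:1234/
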
  • This truncates any lines longer than 80 characters. Also useful for looking at different parts of the line, e.g. cut -b 50-100 shows columns 50 through 100.


    3
    tail -f logfile.log | cut -b 1-80
    plasticboy · 2009-03-26 18:41:57 11

  • Follows the squid access log, rewriting the epoch timestamp at the start of each line into a bracketed human-readable local time.


    3
    tail -f /var/log/squid/access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
    godzillante · 2011-07-06 08:55:27 7
  • When you have one of those (log) files that only has epoch for time (since no one will ever look at them as a date), this is a way to get the human-readable date/time and do further inspection. Mostly perl-fu :-/ (a dependency-free variant follows the entry)


    2
    perl -F' ' -MDate::Format -pale 'substr($_, index($_, $F[1]), length($F[1]), time2str("%C", $F[1]))' file.log
    coffeeaddict_nl · 2009-08-13 13:57:33 4
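
    A minimal sketch of the same idea using only core Perl (no Date::Format), keeping the assumption above that the epoch is the second whitespace-separated field:

    perl -pale 's/\Q$F[1]\E/scalar localtime($F[1])/e' file.log
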
  • This logs the titles of the active windows, so you can monitor what you have done during which times. (It is not hard to also log the executable name, but then the log gets too long; a sketch of that variant follows the entry.)


    2
    while true; do (echo -n $(date +"%F %T"):\ ; xwininfo -id $(xprop -root|grep "ACTIVE_WINDOW("|cut -d\ -f 5) | grep "Window id" | cut -d\" -f 2 ) >> logfile; sleep 60; done
    BeniBela · 2015-09-23 23:00:14 24
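
    A sketch of the executable-name variant mentioned above, using the window's _NET_WM_PID property (an assumption: not every window sets it, so the field can come out empty):

    while true; do
        id=$(xprop -root _NET_ACTIVE_WINDOW | awk '{print $NF}')
        title=$(xwininfo -id "$id" | awk -F'"' '/Window id/ {print $2}')
        pid=$(xprop -id "$id" _NET_WM_PID | awk '{print $NF}')
        exe=$(ps -p "$pid" -o comm= 2>/dev/null)
        echo "$(date '+%F %T'): [$exe] $title" >> logfile
        sleep 60
    done
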
  • A really useful way to combine less and grep while browsing log files. I can't figure out how to make it a true one-liner, so paste it into a script file called lgrep (a sketch of that script follows the entry). Usage: lgrep searchfor file1 [file2 file3]. Advanced example (grep for an Exception in logfiles whose names start with qc): lgrep Exception $(find . -name "qc*.log")


    1
    argv=("$@"); rest=${argv[@]:1}; less -JMN +"/$1" `grep -l $1 $rest`
    lassel · 2009-10-16 17:36:16 3
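
    A plausible rendering as the script file the description asks for, with quoting tightened (grep -l output is still word-split, so file names containing spaces will break):

    #!/bin/bash
    # lgrep: open every file containing $1 in less, jumping to the first match
    pattern=$1
    shift
    files=$(grep -l "$pattern" "$@") || exit 1   # -l lists matching files only
    less -JMN +"/$pattern" $files                # +/pattern jumps to the match
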
  • This command returns a full list of Error 404 entries in the given access log. The following fields are passed to awk: hostname ($2), error code ($9), missing item ($7), referrer ($11). You can then redirect the output to a file (>> /path/to/file) and open it with OpenOffice as a CSV (a variant that emits real comma-separated values follows the entry).


    1
    sudo awk '($9 ~ /404/)' /var/log/httpd/www.domain-access_log | awk '{print $2,$9,$7,$11}' | sort | uniq -c
    ninjasys · 2010-04-09 10:31:50 6
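
    A sketch that writes genuinely comma-separated values for spreadsheet import (same log path and fields as above; the output file name is arbitrary):

    sudo awk -v OFS=',' '$9 == 404 {print $2, $9, $7, $11}' /var/log/httpd/www.domain-access_log | sort -u > 404s.csv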

  • Prints every commit (with its patch) whose diff adds or removes a line matching searchedstring; -z NUL-terminates each commit so perl's -0 can treat it as one record (a rough built-in alternative follows the entry).


    1
    git log -p -z | perl -ln0e 'print if /[+-].*searchedstring/'
    takeshin · 2010-06-13 07:41:22 3
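
    A rough built-in alternative using git's pickaxe option, which shows commits that change the number of occurrences of the string (output is similar but not identical to the perl filter above):

    git log -p -S searchedstring
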
  • GoAccess is an open-source real-time Apache web log analyzer and interactive viewer that runs in a terminal on *nix systems. It provides fast and valuable HTTP statistics for system administrators who require a visual server report on the fly. http://goaccess.prosoftcorp.com/


    1
    goaccess -f /var/log/apache2/access.log -s -b
    allinurl · 2010-10-25 20:03:18 6
  • Replace USERNAME with the desired svn username; replace the first YYYY-MM-DD with the date you want the log to start (at the midnight that begins that date) and the second YYYY-MM-DD with the day after you want it to end (the scan stops at midnight of the previous day). For example, to get the log for December 10, 2010, use {2010-12-10}:{2010-12-11} (spelled out below the entry).


    1
    svn log -r '{YYYY-MM-DD}:{YYYY-MM-DD}' | sed -n '1p; 2,/^-/d; /USERNAME/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'
    antic · 2010-12-22 17:52:19 5
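
    The December 10, 2010 example from the description, spelled out (the username jsmith is hypothetical):

    svn log -r '{2010-12-10}:{2010-12-11}' | sed -n '1p; 2,/^-/d; /jsmith/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'
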
  • This awk command prints a histogram of the number of times 'emergency' is the first word in a line, per day, in an irssi (IRC client) log file.


    0
    awk '/^--- Day changed (.*)/ {st=""; for (i=0;i<ar[date];i++) {st=st"*"} print date" "st; date=$7"-"$5"-"$6} /> emergency/ {ar[date]++} END {st=""; for (i=0;i<ar[date];i++) {st=st"*"}; print date" "st}' #engineyard.log
    menicosia · 2010-02-24 22:54:34 4
  • The same GoAccess analyzer as above, fed only the log lines from a given date onward: sed prints everything from the first line matching 05/Dec/2010 to the end of the file.


    0
    sed -n '/05\/Dec\/2010/,$ p' access.log | goaccess -s -b
    allinurl · 2010-12-13 17:37:33 3
  • The same squid access log tail as above, with colors courtesy of ccze.


    0
    tail -f /var/log/squid/access.log | ccze -CA
    longdrink · 2011-07-15 14:58:53 5
  • If you don't have html2text, this scrapes the random commit message out of the HTML page instead of the plain-text endpoint.


    0
    curl -s 'http://whatthecommit.com/' | grep '<p>' | cut -c4-
    hputman · 2011-11-04 14:37:36 3
  • Step by step: svn log -v takes the full verbose log, and -r {from}:{to} limits it to a revision range; the first awk reads each revision header ('r' followed by a number) and saves the third column as user (the username), printing the line whenever user is george; grep keeps only the changed-path lines starting with M/U/G/C/A/D (A added, D deleted, U updated, G merged, C conflicted); the second awk prints just the file name; and sort | uniq lists each file once.


    0
    svn log -v -r{2009-11-1}:HEAD | awk '/^r[0-9]+ / {user=$3} /./{if (user=="george") {print}}' | grep -E "^ M|^ G|^ A|^ D|^ C|^ U" | awk '{print $2}' | sort | uniq
    smilyface · 2011-12-05 07:36:44 3