Commands tagged log (39)

  • This will log your internet download speed. You can run gnuplot -persist <(echo "plot 'bps' with lines") to get a graph of it.


    10
    echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w '%{speed_download}' -o /dev/null -s $URL) >> bps; sleep 10; done &
    matthewbauer · 2009-09-19 21:26:06
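    A minimal variation (a sketch, assuming GNU date and gnuplot are available) that records a timestamp with each sample, so the graph's x-axis can be real time rather than sample number:

        while true; do echo "$(date +%s) $(curl -L -w '%{speed_download}' -o /dev/null -s http://www.google.com)" >> bps; sleep 10; done &
        gnuplot -persist <(echo "plot 'bps' using 1:2 with lines")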
  • Using grep, retrieve all lines from any log file in /var/log/ that contain one of the problem keywords (err, warn, fail, crit), along with two lines of context around each match (-2).


    6
    grep -2 -iIr "err\|warn\|fail\|crit" /var/log/*
    miketheman · 2009-06-17 19:41:04
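    On systemd-based systems, a rough equivalent is to lean on the journal's own priority levels instead of keyword matching (a sketch, assuming journalctl is present; -p warning shows warnings and anything more severe from the current boot):

        journalctl -b -p warning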
  • Uses date to build a grep pattern for the current month, day, and hour, returning only the last hour's entries from the logfile. Adjusting the format string gets you the last minute's logs or the whole day's.


    6
    grep -i "$(date +%b\ %d\ %H)" syslog
    rubenmoran · 2010-05-23 16:18:15
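    Following the same idea, extending the format string narrows the match to the last minute (same syslog timestamp layout assumed):

        grep -i "$(date +%b\ %d\ %H:%M)" syslog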
  • Returns log entries from a given minute range on the current day, here between 13:40 and 13:45.


    6
    grep -i "$(date '+%b %d') 13:4[0-5]" syslog
    rubenmoran · 2010-05-23 16:30:46
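    To see exactly what pattern grep receives, echo the expansion; on May 23 it comes out as May 23 13:4[0-5]:

        echo "$(date '+%b %d') 13:4[0-5]"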
  • With shell wildcards that exclude rotated files, you can tail the current log files and watch what happens: any error, info, warn...


    5
    tail -f *[!.1][!.gz]
    piscue · 2009-03-06 16:24:44
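    A stricter alternative to the character-class glob above, assuming bash with extglob enabled, that skips rotated and compressed files explicitly:

        shopt -s extglob
        tail -f !(*.1|*.gz)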
  • This command finds the 5 (-n5) most recently updated logs in /var/log, and then does a multifile tail follow of those log files. Alternately, you can do this to follow a specific list of log files: sudo tail -n0 -f /var/log/{messages,secure,cron,cups/error_log}


    5
    ls -drt /var/log/* | tail -n5 | xargs sudo tail -n0 -f
    kanaka · 2009-07-22 14:44:41
  • Just change the date following the -r flag and/or the username in the user== condition, and substitute yms_web with the name of your module.


    4
    svn log -v -r{2009-05-21}:HEAD | awk '/^r[0-9]+ / {user=$3} /yms_web/ {if (user=="george") {print $2}}' | sort | uniq
    jemptymethod · 2009-06-05 14:07:28

  • 4
    zcat access_log.*.gz | awk '{print $7}' | sort | uniq -c | sort -n | tail -n 20
    tkb · 2009-12-11 09:36:30
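    This ranks the 20 most-requested paths across rotated, gzipped access logs; in Apache's common/combined log format, field 7 of each line is the request path. The same idea works for other fields, e.g. a sketch ranking response status codes (field 9) under the same format assumption:

        zcat access_log.*.gz | awk '{print $9}' | sort | uniq -c | sort -rn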
  • When debugging an ssh connection, whether to tune your settings (compression, ciphers) or, more commonly, to diagnose a connection problem, this alias comes in handy: it's not easy to remember the -o LogLevel=DEBUG3 argument, which adds a boost of debugging info not available with -vvv alone. Especially useful are the file-descriptor info and the setup negotiation, for creating a cleaner, faster connection.


    4
    alias sshv='ssh -vvv -o LogLevel=DEBUG3'
    AskApache · 2010-10-30 11:23:52
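    Typical usage when diagnosing a failing connection: ssh writes its debug chatter to stderr, so it can be captured separately (user@host is a placeholder):

        sshv user@host 2> ssh-debug.log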

  • 4
    curl http://whatthecommit.com/index.txt
    nickoe · 2011-12-29 22:35:28
  • This one is tried and tested on Ubuntu 12.04. Works great for tailing any file over HTTP.


    4
    (echo -e "HTTP/1.1 200 Ok\n\r"; tail -f /var/log/syslog) | nc -l 1234
    adimania · 2013-02-09 06:15:42
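    Note that nc flags vary between implementations; the above matches BSD-style netcat. With traditional netcat the listening port needs -p (a sketch of the same idea):

        (echo -e "HTTP/1.1 200 Ok\n\r"; tail -f /var/log/syslog) | nc -l -p 1234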
  • This truncates any lines longer than 80 characters. Also useful for looking at different parts of the line, e.g. cut -b 50-100 shows columns 50 through 100.


    3
    tail -f logfile.log | cut -b 1-80
    plasticboy · 2009-03-26 18:41:57
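    If you would rather keep the full lines and just stop them wrapping, less can follow a file with long lines chopped (Ctrl-C stops the follow):

        less -S +F logfile.log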

  • 3
    tail -f /var/log/squid/access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
    godzillante · 2011-07-06 08:55:27
  • When you have one of those (log)files that only carry epoch timestamps (since no one was ever expected to read them as dates), this converts the second field to a human-readable date/time for further inspection. Mostly perl-fu :-/


    2
    perl -F' ' -MDate::Format -pale 'substr($_, index($_, $F[1]), length($F[1]), time2str("%C", $F[1]))' file.log
    coffeeaddict_nl · 2009-08-13 13:57:33
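    For spot-checking a single value, GNU date converts an epoch directly (the timestamp here is only an example):

        date -d @1250172000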
  • This logs the title of the active window once a minute, so you can review what you did and when. (It is not hard to also log the executable name, but then the log gets too long.)


    2
    while true; do (echo -n $(date +"%F %T"):\ ; xwininfo -id $(xprop -root | grep "ACTIVE_WINDOW(" | cut -d' ' -f5) | grep "Window id" | cut -d\" -f2) >> logfile; sleep 60; done
    BeniBela · 2015-09-23 23:00:14
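    Since each line represents roughly one minute, a quick tally of where the time went (assuming the "date time: title" format produced above):

        cut -d' ' -f3- logfile | sort | uniq -c | sort -rn | head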
  • A really useful way to combine less and grep while browsing log files. It doesn't quite work as a true one-liner, so paste it into a script file called lgrep. Usage: lgrep searchfor file1 [file2 file3]. Advanced example (grep for an Exception in logfiles whose names start with qc): lgrep Exception $(find . -name "qc*.log")


    1
    argv=("$@"); rest=${argv[@]:1}; less -JMN +"/$1" `grep -l $1 $rest`
    lassel · 2009-10-16 17:36:16
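    The script form described above, as a sketch (equivalent to the one-liner, with the pattern quoted):

        #!/bin/bash
        # lgrep: open the files that contain $1 in less, jumping to the first match
        pattern=$1; shift
        less -JMN +"/$pattern" $(grep -l "$pattern" "$@")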
  • This command returns a full list of Error 404 hits in the given access log. The fields handed to awk are: hostname ($2), error code ($9), missing item ($7), and referrer ($11). You can then send the output into a file (>> /path/to/file) and open it with OpenOffice as a CSV.


    1
    sudo awk '($9 ~ /404/)' /var/log/httpd/www.domain-access_log | awk '{print $2,$9,$7,$11}' | sort | uniq -c
    ninjasys · 2010-04-09 10:31:50
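    To go straight to a file that OpenOffice can open as CSV, set awk's output separator to a comma (same log path as above; 404s.csv is just an example name):

        sudo awk -v OFS=',' '$9 ~ /404/ {print $2,$9,$7,$11}' /var/log/httpd/www.domain-access_log > 404s.csv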

  • 1
    git log -p -z | perl -ln0e 'print if /[+-].*searchedstring/'
    takeshin · 2010-06-13 07:41:22
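    This prints whole patches whose added or removed lines mention a string. Recent git can do the filtering itself with the pickaxe options; -G takes a regex matched against changed lines:

        git log -p -G 'searchedstring'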
  • GoAccess is an open-source real-time Apache web log analyzer and interactive viewer that runs in a terminal on *nix systems. It provides fast and valuable HTTP statistics for system administrators who require a visual server report on the fly. http://goaccess.prosoftcorp.com/


    1
    goaccess -f /var/log/apache2/access.log -s -b
    allinurl · 2010-10-25 20:03:18
  • Replace USERNAME with the desired svn username. Replace the first YYYY-MM-DD with the date you want the log from (the scan starts at the midnight that begins that date) and the second YYYY-MM-DD with the day after (the scan ends at the midnight that begins that day). For example, to get the log for December 10, 2010, use {2010-12-10}:{2010-12-11}.


    1
    svn log -r '{YYYY-MM-DD}:{YYYY-MM-DD}' | sed -n '1p; 2,/^-/d; /USERNAME/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'
    antic · 2010-12-22 17:52:19
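    Filled in with the December 10, 2010 example from the description (jdoe is a placeholder username):

        svn log -r '{2010-12-10}:{2010-12-11}' | sed -n '1p; 2,/^-/d; /jdoe/,/^-/p' | grep -E -v '^(r[0-9]|---|$)' | sed 's/^/* /g'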
  • This awk command prints a histogram of the number of times 'emergency' is the first word in a line, per day, in an irssi (IRC client) log file.


    0
    awk '/^--- Day changed (.*)/ {st=""; for (i=0;i<ar[date];i++) {st=st"*"} print date" "st; date=$7"-"$5"-"$6} /> emergency/ {ar[date]++} END {st=""; for (i=0;i<ar[date];i++) {st=st"*"}; print date" "st}' #engineyard.log
    menicosia · 2010-02-24 22:54:34
  • GoAccess again (see above); here sed passes along only the entries from 05/Dec/2010 onward, so the report covers just that slice of the log.


    0
    sed -n '/05\/Dec\/2010/,$ p' access.log | goaccess -s -b
    allinurl · 2010-12-13 17:37:33
  • The same squid tail as above, with colors courtesy of ccze.


    0
    tail -f /var/log/squid/access.log | ccze -CA
    longdrink · 2011-07-15 14:58:53
  • If you don't have html2text, this grabs a random commit message from the HTML page instead.


    0
    curl -s 'http://whatthecommit.com/' | grep '<p>' | cut -c4-
    hputman · 2011-11-04 14:37:36
  • How it works: svn log -v takes the full verbose log, and -r {from}:{to} restricts it to that revision range. The first awk reads the username from the third column of each revision header line (lines starting with r plus a number) and prints every line of entries made by george. The grep keeps only the changed-path lines starting with M, U, G, C, A, or D (Modified, Updated, merGed, Conflicted, Added, Deleted). The second awk prints just the path names, and sort | uniq reduces them to a unique list of files.


    0
    svn log -v -r{2009-11-1}:HEAD | awk '/^r[0-9]+ / {user=$3} /./{if (user=="george") {print}}' | grep -E "^ M|^ G|^ A|^ D|^ C|^ U" | awk '{print $2}' | sort | uniq
    smilyface · 2011-12-05 07:36:44