Commands by penpen (13)

  • Today I learned that commandlinefu.com is not alone in its quest to further knowledge of the command line. Allow me to introduce you to http://shell-fu.org/


    -9
    lynx http://shell-fu.org/
    penpen · 2009-08-20 11:47:35 2
  • Using DynDNS or a similar service not only allows access to your home machine from outside without needing to know which IP address the ISP has assigned to it, it also comes in handy if you want to know your external IP address. The only purpose of the cut command is to remove the leading "host.na.me has address " part from the output. If you don't need to discard it you can simply use host $HOSTNAME


    1
    host $HOSTNAME|cut -d' ' -f4
    penpen · 2009-08-08 12:39:00 2
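
    For illustration, the same idea with a hypothetical DynDNS hostname, using awk instead of cut so the result does not depend on the exact number of words before the address (a sketch, assuming the usual 'name has address a.b.c.d' output of host):

      host myhome.dyndns.example.org | awk '/has address/ { print $NF; exit }'
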
  • To change to $HOME in that manner you need to set a shell option. In zsh it is auto_cd, hence setopt -o auto_cd; in bash 4 it is autocd, hence shopt -s autocd. What the option does is allow you to cd into a directory by just entering its name. This also works if the directory name is stored in a variable: www=/var/www/lighttpd; $www sends you to /var/www/lighttpd. CAUTION: If a command or function with a name identical to the directory name exists, it takes precedence.


    2
    ~
    penpen · 2009-07-24 10:43:53 3
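
    A minimal sketch of the option in both shells (assuming zsh, or bash 4 or later; the paths are only examples):

      # zsh
      setopt auto_cd
      # bash >= 4
      shopt -s autocd
      # with the option set, typing a directory name changes into it:
      /var/www                  # same as: cd /var/www
      www=/var/www/lighttpd
      $www                      # same as: cd /var/www/lighttpd
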
  • Depending on the installation, only some of these man pages are installed. 12 is left out on purpose because ISO/IEC 8859-12 does not exist. To also access the man pages that are not installed locally, use opera (or any other browser that supports all the character sets involved) to display the online versions hosted at kernel.org: for i in $(seq 1 11) 13 14 15 16; do opera http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html; done


    -2
    for i in $(seq 1 11) 13 14 15 16; do man iso-8859-$i; done
    penpen · 2009-03-31 19:40:15 2
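
    A possible combination of both approaches, as a sketch: fall back to the kernel.org copy only when the local man page is missing (opera stands for any graphical browser here):

      for i in $(seq 1 11) 13 14 15 16; do
          man iso-8859-$i 2>/dev/null || opera "http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html"
      done
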
  • Let me suggest using wget for obtaining the HTTP header only as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for

      Spider mode enabled. Check if remote file exists.
      --2009-03-31 20:42:46-- http://www.example.com/
      Resolving www.example.com... 208.77.188.166
      Connecting to www.example.com|208.77.188.166|:80... connected.
      HTTP request sent, awaiting response...

    and the second one stands for

      Length: 438 [text/html]
      Remote file exists and could contain further links, but recursion is disabled -- not retrieving.


    7
    wget --server-response --spider http://www.example.com/
    penpen · 2009-03-31 18:49:14 10
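
    If the overhead bothers you but you still want wget, the response can be filtered out of the noise: wget writes to stderr and indents the server response lines with two spaces (a sketch, assuming GNU wget's usual output format):

      wget --server-response --spider http://www.example.com/ 2>&1 | grep '^  '
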
  • Without the -dump option the header is displayed inside lynx. You can also use w3m; the command then is w3m -dump_head http://www.example.com/


    -1
    lynx -dump -head http://www.example.com/
    penpen · 2009-03-31 18:41:36 1
  • In the above example 'muspi merol' (the output of the first rev command) is sent to stderr and 'lorem ipsum' (the output of the second rev command) is sent to stdout. rev reverses the lines of a file or files. This use of tee allows testing whether a program correctly handles its input without using files that hold the data.


    2
    rev <<< 'lorem ipsum' | tee /dev/stderr | rev
    penpen · 2009-03-31 13:12:09 4
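
    A quick way to confirm which stream carries which line is to silence one of them and see what remains (a sketch; on Linux /dev/stderr follows the redirected descriptor):

      (rev <<< 'lorem ipsum' | tee /dev/stderr | rev) > /dev/null     # leaves only 'muspi merol' (stderr)
      (rev <<< 'lorem ipsum' | tee /dev/stderr | rev) 2> /dev/null    # leaves only 'lorem ipsum' (stdout)
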
  • rot13 maps a..m,n..z (A..M,N..Z) to n..z,a..m (N..Z,A..M), and so does this alias.


    10
    alias rot13="tr '[A-Za-z]' '[N-ZA-Mn-za-m]'"
    penpen · 2009-03-30 19:08:49 2
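
    Usage example (output shown as comments; applying the alias twice returns the original, since rot13 is its own inverse):

      echo 'Lorem ipsum' | rot13    # Yberz vcfhz
      echo 'Yberz vcfhz' | rot13    # Lorem ipsum
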
  • An improved version of http://www.commandlinefu.com/commands/view/1772/simple-countdown-from-a-given-date that uses Perl to pretty-print the output. Note that the GNU-style '--no-title' option has been replaced by its one-letter counterpart '-t'.


    -2
    watch -tn1 'bc<<<"`date -d'\''friday 21:00'\'' +%s`-`date +%s`"|perl -ne'\''@p=gmtime($_);printf("%dd %02d:%02d:%02d\n",@p[7,2,1,0]);'\'
    penpen · 2009-03-29 19:53:36 4
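
    The quoting makes the one-liner hard to read, so here is the same computation unrolled as a sketch (without watch), still assuming GNU date and 'friday 21:00' as the target:

      target=$(date -d 'friday 21:00' +%s)    # target time as a Unix timestamp
      now=$(date +%s)
      # interpret the difference as a time_t and print day-of-year, hours, minutes, seconds
      perl -e '@p = gmtime($ARGV[0]); printf("%dd %02d:%02d:%02d\n", @p[7,2,1,0]);' $(( target - now ))
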
  • Here $HOME/shots must exist and have appropriate access rights, and sitecopy must be correctly set up to upload new screen shots to the remote site. Example .sitecopyrc (for illustration purposes only):

      site shots
        server ftp.example.com
        username user
        password antabakadesuka
        local /home/penpen/shots
        remote public_html/shots
        permissions ignore

    The command uses scrot to create a screen shot, moves it to the screen shot directory, uploads it using sitecopy, uses xsel to copy the URL to the paste buffer (so that you can paste it with a middle click) and finally uses feh to display a preview of the screen shot. Note that $BASE stands for the base URL of the screen shots on the remote server; replace it with the actual location, in this example http://www.example.com/~user/shots would be fitting. Assign this command to a key combination or an icon in whatever panel you use.


    -1
    scrot -e 'mv $f \$HOME/shots/; sitecopy -u shots; echo "\$BASE/$f" | xsel -i; feh `xsel -o`'
    penpen · 2009-03-26 12:08:39 1
  • A web server using $HOME/public_html as the user directory is required, $HOME/public_html/shots must exist and have appropriate access rights, and $HOSTNAME must be known to and accessible from the outside world. The command uses scrot to create a screen shot, moves it to the screen shot directory, uses xsel to copy the URL to the paste buffer (so that you can paste it with a middle click) and finally uses feh to display a preview of the screen shot. Assign this command to a key combination or an icon in whatever panel you use.


    0
    scrot -e 'mv $f \$HOME/public_html/shots/; echo "http://\$HOSTNAME/~\$USER/shots/$f" | xsel -i; feh `xsel -o`'
    penpen · 2009-03-26 11:32:09 2
  • Have netcat listen on port 8000, point your browser at http://localhost:8000/ and you will see the information sent. netcat terminates as soon as your browser disconnects. I tested this command on my Fedora box, but linuxrawkstar pointed out that he needs to use nc -l -p 8000 instead. This depends on the netcat version you use: the additional '-p' is required by GNU netcat, which for example is used by Debian, but not by the OpenBSD netcat port used by my Fedora system.


    2
    nc -l 8000
    penpen · 2009-03-25 23:09:38 3
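
    The two invocations side by side; which one you need depends on the netcat flavour installed:

      nc -l 8000       # OpenBSD netcat (e.g. Fedora)
      nc -l -p 8000    # GNU netcat (e.g. Debian)
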
  • 'watch' repeatedly (by default every 2 seconds; -n 1 => every second) runs a command (here ':', a shorthand for 'true'), displays its output (here nothing) and the date and time of the last run. I thought it to be obvious but it seemingly is not: to exit, use Ctrl-C.


    -2
    watch -n 1 :
    penpen · 2009-03-25 23:00:28 2
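
    For comparison, a rough equivalent without watch (a sketch; it redraws the screen once per second, much like the command above):

      while sleep 1; do clear; date; done
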
