Commands by penpen (13)

  • Today I learned that commandlinefu.com is not alone in its quest to further knowledge of the command line. Allow me to introduce you to http://shell-fu.org/


    -9
    lynx http://shell-fu.org/
    penpen · 2009-08-20 11:47:35 5
  • Using DynDNS or a similar service not only allows access to your home machine from outside without needing to know which IP address the ISP has assigned to it, it also comes in handy if you want to know your external IP address. The only purpose of the cut command is to remove the leading "host.na.me has address " part from the output. If you don't need to discard it you can simply use host $HOSTNAME. (Illustrative output is sketched after this entry.)


    1
    host $HOSTNAME|cut -d' ' -f4
    penpen · 2009-08-08 12:39:00 5
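
    A rough illustration of what the cut strips away (the hostname and address below are placeholders, not real values):

      $ host myhost.dyndns.example.org
      myhost.dyndns.example.org has address 203.0.113.42
      $ host myhost.dyndns.example.org | cut -d' ' -f4
      203.0.113.42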
  • To change to $HOME in that manner you need to set a shell option. In zsh it is auto_cd, hence setopt -o auto_cd; in bash 4 it is autocd, hence shopt -s autocd. The option allows you to cd into a directory by just entering its name. This also works if the directory name is stored in a variable: www=/var/www/lighttpd; $www sends you to /var/www/lighttpd. CAUTION: If a command or function with a name identical to the directory name exists, it takes precedence. (A short illustrative session follows this entry.)


    2
    ~
    penpen · 2009-07-24 10:43:53 6
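
    A minimal sketch of the option in action, assuming bash 4; the paths are illustrative and must exist on your system:

      shopt -s autocd          # enable the option
      /var/www                 # same effect as: cd /var/www
      www=/var/www/lighttpd
      $www                     # same effect as: cd /var/www/lighttpd
      ~                        # back to $HOME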
  • Depending on the installation, only some of these man pages are available. 12 is left out on purpose because ISO/IEC 8859-12 does not exist. To also view the manpages that are not installed, use opera (or any other browser that supports all the character sets involved) to display the online versions hosted at kernel.org: for i in $(seq 1 11) 13 14 15 16; do opera http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html; done (a combined local/online variant is sketched after this entry)


    -2
    for i in $(seq 1 11) 13 14 15 16; do man iso-8859-$i; done
    penpen · 2009-03-31 19:40:15 5
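
    A rough variant that only falls back to the kernel.org copy (the opera call and URL pattern are taken from the description above) when the local manpage is missing:

      for i in $(seq 1 11) 13 14 15 16; do
          man iso-8859-$i 2>/dev/null || opera "http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html"
      done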
  • Let me suggest wget as a last resort for obtaining just the HTTP header, because it generates considerable textual overhead. The first ellipsis of the sample output stands for

        Spider mode enabled. Check if remote file exists.
        --2009-03-31 20:42:46-- http://www.example.com/
        Resolving www.example.com... 208.77.188.166
        Connecting to www.example.com|208.77.188.166|:80... connected.
        HTTP request sent, awaiting response...

    and the second one stands for

        Length: 438 [text/html]
        Remote file exists and could contain further links, but recursion is disabled -- not retrieving.

    The response headers themselves appear between those two blocks; an illustrative excerpt follows this entry.


    7
    wget --server-response --spider http://www.example.com/
    penpen · 2009-03-31 18:49:14 15
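
    Roughly what --server-response prints in between (header values are made up; wget indents each header line with two spaces):

      HTTP/1.1 200 OK
      Date: Tue, 31 Mar 2009 18:49:14 GMT
      Server: Apache
      Content-Type: text/html; charset=UTF-8
      Content-Length: 438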
  • Without the -dump option the header is displayed inside lynx itself. You can also use w3m; the command then is w3m -dump_head http://www.example.com/


    -1
    lynx -dump -head http://www.example.com/
    penpen · 2009-03-31 18:41:36 6
  • In the above example 'muspi merol' (the output of the first rev command) is sent to stderr and 'lorem ipsum' (the output of the second rev command) is sent to stdout. rev reverses the characters on each line of its input. This use of tee allows testing whether a program handles its input correctly without creating files to hold the data. (A sketch of telling the two streams apart follows this entry.)


    2
    rev <<< 'lorem ipsum' | tee /dev/stderr | rev
    penpen · 2009-03-31 13:12:09 7
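
    A minimal way to confirm which stream carries which text, using only a redirection:

      rev <<< 'lorem ipsum' | tee /dev/stderr | rev > /dev/null    # only 'muspi merol' remains visible: it arrived via stderr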
  • rot13 maps a..mn..z (A..MN..Z) to n..za..m (N..ZA..M) and so does this alias. (A quick check follows this entry.)


    10
    alias rot13="tr '[A-Za-z]' '[N-ZA-Mn-za-m]'"
    penpen · 2009-03-30 19:08:49 9
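
    A quick sanity check of the alias; applying it twice returns the input, since rot13 is its own inverse:

      $ echo 'Hello, World!' | rot13
      Uryyb, Jbeyq!
      $ echo 'Hello, World!' | rot13 | rot13
      Hello, World!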
  • An improved version of http://www.commandlinefu.com/commands/view/1772/simple-countdown-from-a-given-date that uses Perl to pretty-print the output. Note that the GNU-style '--no-title' option has been replaced by its one-letter counterpart '-t'. (The quoting makes the one-liner hard to read; an unrolled sketch follows this entry.)


    -2
    watch -tn1 'bc<<<"`date -d'\''friday 21:00'\'' +%s`-`date +%s`"|perl -ne'\''@p=gmtime($_);printf("%dd %02d:%02d:%02d\n",@p[7,2,1,0]);'\'
    penpen · 2009-03-29 19:53:36 8
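
    A rough unrolled equivalent of what watch re-runs every second, assuming GNU date; 'friday 21:00' is the example target from the one-liner, and shell arithmetic stands in for the bc call:

      target=$(date -d 'friday 21:00' +%s)   # target date as a Unix timestamp
      now=$(date +%s)                        # current time as a Unix timestamp
      echo $((target - now)) |
          perl -ne '@p = gmtime($_); printf("%dd %02d:%02d:%02d\n", @p[7,2,1,0]);'
      # gmtime() returns (sec, min, hour, mday, mon, year, wday, yday);
      # indices 7,2,1,0 pick out days, hours, minutes and seconds.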
  • Here $HOME/shots must exist and have appropriate access rights, and sitecopy must be correctly set up to upload new screen shots to the remote site. An example .sitecopyrc (for illustration purposes only):

        site shots
          server ftp.example.com
          username user
          password antabakadesuka
          local /home/penpen/shots
          remote public_html/shots
          permissions ignore

    The command uses scrot to create a screen shot, moves it to the screen shot directory, uploads it using sitecopy, uses xsel to copy the URL to the paste buffer (so that you can paste it with a middle click) and finally uses feh to display a preview of the screen shot. Note that $BASE stands for the base URL of the screen shots on the remote server; replace it by the actual location. In the example above, http://www.example.com/~user/shots would be fitting. Assign this command to a key combination or an icon in whatever panel you use.


    -1
    scrot -e 'mv $f \$HOME/shots/; sitecopy -u shots; echo "\$BASE/$f" | xsel -i; feh `xsel -o`'
    penpen · 2009-03-26 12:08:39 4
  • A web server that serves $HOME/public_html as the user directory is required, $HOME/public_html/shots must exist and have appropriate access rights, and $HOSTNAME must be known to and reachable from the outside world. The command uses scrot to create a screen shot, moves it to the screen shot directory, uses xsel to copy the URL to the paste buffer (so that you can paste it with a middle click) and finally uses feh to display a preview of the screen shot. Assign this command to a key combination or an icon in whatever panel you use.


    0
    scrot -e 'mv $f \$HOME/public_html/shots/; echo "http://\$HOSTNAME/~\$USER/shots/$f" | xsel -i; feh `xsel -o`'
    penpen · 2009-03-26 11:32:09 5
  • Have netcat listen on port 8000, point your browser to http://localhost:8000/ and you see the information it sends. netcat terminates as soon as your browser disconnects. I tested this command on my Fedora box, but linuxrawkstar pointed out that he needs to use nc -l -p 8000 instead. This depends on the netcat version you use: the additional '-p' is required by GNU netcat (used, for example, by Debian) but not by the OpenBSD netcat port that my Fedora system uses. (An illustrative request dump follows this entry.)


    2
    nc -l 8000
    penpen · 2009-03-25 23:09:38 6
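
    Roughly what appears in the terminal when a browser connects; headers vary by browser, so this is an illustration rather than captured output:

      GET / HTTP/1.1
      Host: localhost:8000
      User-Agent: Mozilla/5.0 (...)
      Accept: text/html,application/xhtml+xml,*/*;q=0.8
      Connection: keep-alive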
  • 'watch' repeatedly runs a command (by default every 2 seconds; -n 1 makes it every second) and displays its output (here nothing, since ':' is a shorthand for 'true') together with the date and time of the last run. I thought it obvious, but seemingly it is not: to exit, press Ctrl-C.


    -2
    watch -n 1 :
    penpen · 2009-03-25 23:00:28 6
