Commands by Ryangillan (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Summarize the number of open TCP connections by state
Useful for checking the number and state of TCP connections.
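The command itself isn't reproduced in this listing; a minimal sketch, assuming Linux netstat and awk, might be:

$ netstat -tan | awk '/^tcp/ {print $6}' | sort | uniq -c | sort -rn

On newer systems without net-tools, ss -s prints a similar per-state summary.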

Random number generation within a range N, here N=10
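The original one-liner isn't shown here; two common bash approaches (assumptions, not necessarily the original) are:

$ echo $(( RANDOM % 10 ))    # 0-9, using bash's built-in RANDOM
$ shuf -i 1-10 -n 1          # 1-10, using GNU coreutils shuf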

Empty Bind9 cache
Occasionally, to force zone updating, a cache flush is necessary. Using this command is better than restarting the Bind9 process.
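The command isn't reproduced on this page; with BIND's standard control utility rndc, flushing the cache looks like this (run as root or via sudo):

$ rndc flush        # empty the whole cache
$ rndc flush lan    # or flush a single view, e.g. one named "lan"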

Dump bash command history of an active shell user.
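The command isn't shown here; one known technique (an assumption on my part, not necessarily the original) is to attach gdb to the user's running bash and ask readline to write its in-memory history out. "someuser" and the output path below are placeholders, and attaching usually requires root:

$ gdb --batch -p $(pgrep -n -u someuser bash) -ex 'call (int) write_history("/tmp/someuser_history")'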

Easy Persistent SSH Connections Using Screen
Use as: $ s host1. Will ssh to the remote host upon first invocation. Then use C-a d to detach. Running "s host1" again will resume the shell session on the remote host. Only useful in a LAN environment; over a WAN you'd want to start the screen session on the remote host. Adapted from Hack 34 in Linux Server Hacks, 2nd Edition.
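The actual function isn't included in this listing; a minimal sketch of the idea, assuming a shell function named s defined in ~/.bashrc, is:

s() { screen -d -RR -S "$1" ssh "$1"; }

screen -d -RR reattaches the named session if it exists and otherwise creates it, so the first call starts ssh inside screen and later calls resume that same shell.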

Colorize svn stat
Use color escape sequences and sed to colorize the output of svn stat -u.
Colors: http://www.faqs.org/docs/abs/HTML/colorizing.html
svn stat characters: http://svnbook.red-bean.com/en/1.4/svn-book.html#svn.ref.svn.c.status
GNU extensions for escapes in regular expressions: http://www.gnu.org/software/sed/manual/html_node/Escapes.html
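The full command isn't reproduced here; a minimal sketch with GNU sed (the colors, and which status characters get which color, are my own choices rather than the original's) could be:

$ svn stat -u | sed -e 's/^M.*/\x1b[32m&\x1b[0m/' -e 's/^C.*/\x1b[31m&\x1b[0m/' -e 's/^?.*/\x1b[34m&\x1b[0m/'

\x1b is the GNU sed escape for ESC, so each matched line is wrapped in an ANSI color sequence and reset with \x1b[0m.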

bulk rename files with sed, one-liner
Far from my favorite, but it works in sh and with an old sed that doesn't support '-E'.
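No command body survives on this page; a portable sketch of the pattern (the .jpeg-to-.jpg substitution is only an example) is:

$ for f in *.jpeg; do mv "$f" "$(echo "$f" | sed 's/\.jpeg$/.jpg/')"; done

The basic regular expression used here needs no -E, so it also works with older sed implementations.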

A DESTRUCTIVE command to render a drive unbootable
Overwrites the boot sector. Since this doesn't overwrite any file data, you can usually recover by re-creating the partition table exactly as it was before you zeroed it. This can also sometimes help if you install a new drive in a Windows machine that can't read it.
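The command isn't printed in this listing; the classic way to zero a boot sector with dd (sdX is a placeholder, and this really is destructive) is:

$ dd if=/dev/zero of=/dev/sdX bs=512 count=1    # DESTRUCTIVE: wipes the MBR, including the partition table

Double-check the target device before running anything like this.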

Get all URLs from webpage via Regular Expression
Get all URLs from a website via regular expression. You must have lynx installed on your computer to execute the command: lynx --dump "<website>" | egrep -o "<regex>". Substitute <website> with the address of the site you want to extract URLs from, and <regex> with the regular expression you want to filter them with.
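As a concrete sketch (the URL and the pattern below are placeholders, not the original values):

$ lynx --dump "http://example.com" | egrep -o "https?://[^ ]+"

lynx --dump renders the page as plain text with a list of references, and egrep -o prints only the parts that match the pattern.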

Remove all unused kernels with apt-get
TIMTOWTDI (there's more than one way to do it). See the sketch below.
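The command itself isn't quoted here; one cautious sketch (the version glob and the pipeline are my assumptions, so review the list before purging) is:

$ dpkg -l 'linux-image-[0-9]*' | awk '/^ii/ {print $2}' | grep -v "$(uname -r)"    # list installed kernels other than the running one
$ sudo apt-get purge <packages from the list above>

On current Debian/Ubuntu systems, sudo apt-get autoremove --purge usually achieves the same end.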


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and 10 votes, so only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
