commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
The J option is a recent addition to GNU tar. The xz compression utility is required as well.
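A minimal sketch of the `J` flag in action, assuming GNU tar 1.20+ and the `xz` utility are on the PATH (file and directory names here are made up for illustration):

```shell
# Create a small directory to archive
mkdir -p demo && echo "hello" > demo/file.txt

# J tells GNU tar to filter the archive through xz
tar -cJf demo.tar.xz demo

# Listing the archive also goes through xz via J
tar -tJf demo.tar.xz
```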
Polls The Pirate Bay mirrors list, chooses a random site, and opens it for you in Firefox.
Can use a cookie from Rapidshare, as created by the command on http://www.commandlinefu.com/commands/view/1756/download-from-rapidshare-premium-using-wget-part-1
!! will expand to your previous command, thus creating the alias "foo" (does not work consistently for commands with quotation marks)
This particular combination of flags mimics Try CoffeeScript (at http://coffeescript.org/#try:) as closely as possible, and the `tail` call removes the comment `// Generated by CoffeeScript 1.6.3`.
See `coffee -h` for explanation of `coffee`'s flags.
This is a handy way to find which modules are loaded with Apache web server.
`tar xfzO` extracts to STDOUT, which is redirected directly to mysql. Really helpful when your hard drive can't fit two copies of an uncompressed database :)
Today many hosts block traditional ICMP echo requests for "security" reasons, so nmap's fast ARP scan is more useful for viewing all live IPv4 devices around you. You must be root for ARP scanning.
Calculates the folder size for each website in an ISPConfig environment. It doesn't add the jail size, just the "public_html".
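A sketch of the per-site summing, using a hypothetical directory layout standing in for an ISPConfig web root (real ISPConfig paths differ between versions):

```shell
# Hypothetical layout standing in for an ISPConfig web root
mkdir -p www/site1/public_html www/site2/public_html
dd if=/dev/zero of=www/site1/public_html/a.bin bs=1024 count=64 2>/dev/null
dd if=/dev/zero of=www/site2/public_html/b.bin bs=1024 count=128 2>/dev/null

# Sum only each site's public_html, ignoring everything else in the jail
du -sh www/*/public_html
```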
# Limited and very hacky wildcard rename
# works for rename *.ext *.other
# and for rename file.* other.*
# but fails for rename file*ext other*other and many more
# Might be good to merge this technique with mmv command...
mv-helper() {
  argv="$(history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //')"
  files="$(echo "$argv" | sed -e 's/ .*//')"
  str="$(history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //' | tr -d '*')"
  set -- $str
  for file in $files
  do
    echo mv "$file" "$(echo "$file" | sed -e "s/$1/$2/")"
    mv "$file" "$(echo "$file" | sed -e "s/$1/$2/")"
  done
}
alias rename='mv-helper #'
Get information at your fingertips about how Apache was compiled.
Simple compressed backup of /etc.
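A minimal sketch of such a backup. Since archiving the real /etc needs root, a scratch directory stands in for it here; the file names are made up for illustration:

```shell
# A scratch directory stands in for /etc; as root you would point tar
# at /etc itself, e.g.:  tar czpf /backup/etc-$(date +%F).tar.gz /etc
mkdir -p etc-demo && echo "nameserver 1.1.1.1" > etc-demo/resolv.conf

# c=create, z=gzip, p=preserve permissions, f=archive file; date-stamp the name
tar czpf "etc-backup-$(date +%F).tar.gz" etc-demo
```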
Defunct processes (zombies) usually have to be cleaned up by killing their parent processes. This command retrieves such zombies and their immediate parents and kills all of the matching processes.
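The listing half of this can be sketched with `ps` and `awk`; the kill step is shown only as a comment, since blindly killing parent processes is destructive (exact `ps` column names may vary between systems):

```shell
# List processes in state Z (zombie) together with their parent PIDs.
# The awk match on a leading Z also skips the STAT header row.
ps -A -o stat,pid,ppid,comm | awk '$1 ~ /^[Zz]/ { print }'

# To actually reap them you would kill the parents (DESTRUCTIVE,
# shown as a comment only):
#   kill -9 $(ps -A -o stat,ppid | awk '$1 ~ /^[Zz]/ { print $2 }')
```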
Exactly the same effect with 3 fewer characters ;-) (Removes all files/filesystems of a hard disk. It removes EVERYTHING on your hard disk. Be careful when selecting a device.)
You can press Ctrl+C after a few seconds.
If you have a folder with thousands of files and want many folders with only 100 files each, run this.
It will create 0/, 1/, etc. and put 100 files inside each one.
But find will return true even if it doesn't find anything...
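The bucketing described above can be sketched with a plain loop (a minimal version under assumed file names; the original command likely used find):

```shell
# Make a sample directory with 250 files
mkdir -p bulk && cd bulk
for i in $(seq 1 250); do touch "file$i"; done

# Distribute them into numbered subfolders of 100 files each (0/, 1/, 2/ ...)
i=0
for f in file*; do
  d=$((i / 100))        # integer division picks the bucket
  mkdir -p "$d"
  mv "$f" "$d/"
  i=$((i + 1))
done
# 0/ and 1/ now hold 100 files each; 2/ holds the remaining 50
cd ..
```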
This option selects the listing of all Internet and X.25 (HP-UX) network files.
If you work in an environment where some SSH hosts change regularly, this might be handy...
Colors the current date in cal output.