commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
Download a website to 5 levels deep and browse it offline!
-k -> convert-links (to browse offline)
-r -> recursive download
-l 5 -> level 5
For example:
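Putting those flags together (example.com is just a placeholder for the site you want to mirror):
wget -r -l 5 -k http://example.com/   # -r recurse, -l 5 depth of 5, -k convert links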
:-)
The Google API gives you only one translation, which is sometimes insufficient. This function gives you all the translations, so you can choose which one fits best.
This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.
To get a random xkcd comic use the following:
xkcdrandom() { wget -qO- http://dynamic.xkcd.com/comic/random | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; }
These are just a bit shorter than the ones eigthmillion wrote; however, his version didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is then executed by bash.
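The latest-comic function itself isn't reproduced here; as a sketch, the random version can be pointed at the front page instead, assuming the front-page markup matches and that eog is your image viewer:
xkcd() { wget -qO- http://xkcd.com/ | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; }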
The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.
Retrieves the HTML from a random commandlinefu page, then finds the commands on the page and prints them.
Alternatively, pipe the output to bash (add "| bash" to the end) to execute the command (very risky).
Edit: had to adjust the part that replaces HTML entities (e.g. &quot; -> ") so it displays properly.
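The scraping command itself isn't shown in this excerpt; as a simpler sketch, the site also exposes (or at least used to expose) a plaintext endpoint that sidesteps the entity-decoding problem entirely:
wget -qO- http://www.commandlinefu.com/commands/random/plaintext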
Wgets "whatismyip" from checkip.dyndns.org and filters out the actual IP-adress. Usefull when you quickly need to find the outward facting IP-address of your current location.
Other options:
* replace md5sum with sha1sum for a SHA1 checksum
* replace '>' with '| tar zx' to extract the tarball instead
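The base command these options modify isn't included in this excerpt; one plausible shape, with placeholder URL and filename, is a download that is saved and checksummed in a single pass:
wget -qO- http://example.com/file.tar.gz | tee >(md5sum) > file.tar.gz   # placeholder URL and filename
With that shape, swapping md5sum for sha1sum gives the SHA1 option, and replacing '> file.tar.gz' with '| tar zx' extracts the stream instead of saving it.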
Can use a cookie from Rapidshare, as created by the command on http://www.commandlinefu.com/commands/view/1756/download-from-rapidshare-premium-using-wget-part-1
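The download half isn't shown here; a sketch, assuming the part-1 command saved the premium cookie to a file (the cookie path and URL are placeholders):
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/XXXXXX/yourfile.zip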
A quick variation on the latest commands list with the newlines skipped. This is faster to read.
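A sketch of the idea, assuming the site's plaintext browse endpoint, with grep dropping the empty lines:
wget -qO- http://www.commandlinefu.com/commands/browse/plaintext | grep -v '^$'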
Curl is not installed by default on many common distros anymore. wget always is :)
wget -qO- ifconfig.me/ip
Uses the Google API to translate. You can change the languages by modifying the "langpair=|en" parameter; the format is input language|output language.
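The function itself isn't reproduced in this excerpt; a sketch of the idea against the (since retired) Google AJAX translate API, where the first argument is the text, the second the source language and the third the target language, defaulting to English:
translate() { wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=${2}|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*/\1/'; }
e.g. translate bonjour fr would print the English translation.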
This is a simple way to run complex shell scripts via ssh, for instance if you have to run the same process on several hundred hosts. There is no security here, so you have to trust the server that the script is sourced from.
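A sketch of the pattern (host names, URL and script name are placeholders; the web server has to be trusted, since whatever it serves gets executed):
for host in host1 host2 host3; do
  wget -qO- http://example.com/setup.sh | ssh "$host" 'bash -s'
done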
A cronjob command line to email someone when a website's homepage is updated.
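One way this could look as a crontab entry (the URL and address are placeholders, and it assumes a working mail command); it keeps a copy of the page and mails when a fresh fetch differs:
0 * * * * wget -qO /tmp/homepage.new http://example.com/ && { cmp -s /tmp/homepage.new /tmp/homepage.old || echo "homepage updated" | mail -s "homepage updated" you@example.com; }; mv -f /tmp/homepage.new /tmp/homepage.old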
kaffeine could be replaced by any player able to read an mms stream.
You need to have the RC ISO pre-downloaded before running the command.
Tested with 9.10 release. Choose whatever torrent client you prefer.
An extension to tali713's random fact generator. It takes the output and sends it to notify-osd. Display time is proportional to the length of the fact.
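A sketch of the idea, assuming a random_fact command that prints a single fact to stdout (random_fact is a stand-in for tali713's generator); notify-send's -t takes milliseconds, so the timeout here scales with the fact's length:
fact="$(random_fact)"   # random_fact is a placeholder for the fact generator
notify-send -t $(( ${#fact} * 100 )) "Random fact" "$fact"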