commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
Use the excellent sensiblepasswords.com to generate a random (yet easy-to-remember) password every second, and copy it to the clipboard. Useful for generating a list of passwords and pasting them into a spreadsheet.
This script uses "madebynathan"'s "cb" function (http://madebynathan.com/2011/10/04/a-nicer-way-to-use-xclip/); you could also replace "cb" with
xclip -selection c
Remove "while true; do" and "; done" to generate and copy only 1 password.
Let me suggest using wget to obtain only the HTTP header as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for
Spider mode enabled. Check if remote file exists.
--2009-03-31 20:42:46-- http://www.example.com/
Resolving www.example.com... 192.0.2.1
Connecting to www.example.com|192.0.2.1|:80... connected.
HTTP request sent, awaiting response...
and the second one stands for
Length: 438 [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
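Assuming the command under discussion is wget's spider mode with server-response printing, one way to trim that overhead down to just the header (wget writes it to stderr, indented by two spaces) is:

wget -S --spider http://www.example.com/ 2>&1 | sed -n 's/^  //p'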
Without the -dump option, the header is displayed inside lynx itself. You can also use w3m; the command is then
w3m -dump_head http://www.example.com/
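For comparison, the lynx invocation this comment refers to is presumably:

lynx -dump -head http://www.example.com/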
This will connect to your hosting service through the cPanel interface and use its backup tool to back up and download the entire website locally.
(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
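A sketch of the sort of command being described, assuming an older cPanel setup listening on the standard port 2082 and exposing its "getbackup" endpoint (the URL path and date format are assumptions; check your cPanel version):

wget --http-user=YourUsername --http-password=YourPassword \
  "http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-$(date +%-m-%d-%Y).tar.gz"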