commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
This command finds all the files whose status has changed between the ctime of the older file and the ctime of the newer file.
Very useful when you can see from an ls listing a block of consecutive files you want to move or delete, but can't easily work out the exact time range from the dates.
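A minimal sketch of what the find invocation might look like, assuming two reference files named older and newer (placeholder names):

find . -cnewer older ! -cnewer newer

This lists files under the current directory whose status changed after older was last modified, but not after newer was.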
On systems where your home directory is shared across different machines, your bash history will be global, rather than being a separate history per machine. This setting in your .bashrc file will ensure that each machine has its own history file.
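A sketch of one way to achieve this, assuming you want each host's history kept in its own file named after the machine:

export HISTFILE="$HOME/.bash_history_$(hostname)"

Put the line in your .bashrc so every host reads and writes its own file.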
Write 200 blocks of 512k to a dummy file with dd, timing the result. This is useful as a quick test to compare the performance of different file systems.
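The command might look something like this, where testfile is just a placeholder output name:

time dd if=/dev/zero of=testfile bs=512k count=200

Wrapping the command in time reports the overall wall-clock duration, which makes comparisons between file systems straightforward.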
Instead of deleting an existing symlink and then re-creating it pointing at the new location, it is possible to perform the same action with this one command.
Interesting discussion on whether this is possible to do atomically here: http://answers.google.com/answers/threadview?id=526119
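A sketch of the one-step form, with placeholder paths:

ln -nsf /path/to/new/target existing_link

The -s flag makes a symbolic link, -f overwrites the existing one, and -n treats existing_link itself as the link to replace rather than following it if it points at a directory.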
On a machine behind a firewall, it's possible to pass the proxy server address in as a prefix to wget to avoid having to set it as an environment variable first.
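For example, assuming a hypothetical proxy at proxy.example.com on port 8080:

http_proxy=http://proxy.example.com:8080 wget http://www.example.com/file.tar.gz

The variable assignment before the command applies only to that single wget invocation, so the rest of the shell session is unaffected.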
A *.tar.gz file needs to be unzipped & then untarred. Previously I might have unzipped first with
gunzip -d file.tar.gz
and then untarred the result with
tar -xvf file.tar
(Options are extract, verbose, file)
Using the -z (decompress) option on tar avoids the use of gzip (or gunzip) first.
Additionally, the -C option specifies the directory to extract to.
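Putting both options together, the whole job might be done in one command like this, with /tmp/extracted as an example destination (the directory must already exist):

tar -zxvf file.tar.gz -C /tmp/extracted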