commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Find all .gz files and recompress them to bz2 on the fly. No temp files.
edit: forgot the double quotes! jeez!
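One way to do this, sketched here assuming GNU find plus standard gzip and bzip2 (the quoting is what the edit above refers to; the double quotes around "$1" protect filenames containing spaces):

    find . -name "*.gz" -exec sh -c 'gunzip -c "$1" | bzip2 > "${1%.gz}.bz2"' _ {} \;

Each file is decompressed to stdout and recompressed through a pipe, so nothing is written to disk except the final .bz2.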
If you have servers on a Wide Area Network (WAN), you may experience very slow transfers due to limited bandwidth and latency.
To speed up your transfers, compress the data so there is less of it to send.
The solution is to use a compression tool such as gzip, bzip2 or compress before and after the data transfer.
Note that the ssh "-C" option is not compatible with every ssh version (ssh2, for instance).
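A portable alternative is to compress explicitly on both ends of the pipe. A sketch (the host and path are placeholders):

    tar cf - /path/to/data | gzip | ssh user@remote 'gzip -d | tar xf -'

This works with any ssh version, since the compression happens outside ssh itself.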
Get a gzip-compressed web page using wget.
Caution: the command will fail if the website doesn't return gzip-encoded content, though most websites support gzip nowadays.
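A minimal sketch (the URL and output filename are placeholders): request gzip encoding via a header and decompress on the fly:

    wget -q -O - --header="Accept-Encoding: gzip" http://example.com/ | gunzip > page.html

If the server ignores the header and sends plain content, gunzip will fail with an error, which is the caveat noted above.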
gunzip all .gz files in the current directory
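The simplest form, since gunzip accepts multiple files:

    gunzip *.gz

Each file.gz is replaced by its decompressed file.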
Display a progress bar while restoring a MySQL dump.
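One way to get the progress bar is the pv utility, which reports how much data has passed through the pipe. A sketch with placeholder credentials and database name:

    pv database.sql | mysql -u user -p database

Because pv knows the size of database.sql, it can show a bar, percentage and ETA while mysql consumes the dump.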
I've kept the gzip compression at a low level, but depending on the CPU power available on the source machine you may want to increase it. However, SQL compresses really well, and I found that even with -1 I was able to transfer 40 MiB/s over a 100 Mbps wire, which was good enough for me.
This command will dump a database on a remote server to stdout, compress it, stream it to your local machine, decompress it and put it into a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately. I had to use this recently because the server I needed a backup from didn't have enough disk space.
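A sketch of the pipeline described above (host, credentials and database name are all placeholders):

    ssh user@remote "mysqldump -u dbuser -p'secret' database | gzip -1" | gunzip > database.sql

The dump never touches the remote disk: mysqldump writes to stdout, gzip -1 compresses it cheaply before it crosses the wire, and gunzip restores it on the local side. To restore immediately instead of saving a file, replace "> database.sql" with "| mysql -u dbuser -p database".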