This command will dump a database on a remote server to stdout, compress it, stream it to your local machine, decompress it, and write it to a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately. I had to use this recently because the server I needed a backup from didn't have enough disk space.
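The command itself isn't shown here, but a sketch of the idea looks like this (the host and database name are hypothetical, and mysqldump credentials are omitted):
# user@remote and mydb are placeholders; add -u/-p options as needed
ssh user@remote "mysqldump mydb | gzip -1" | gunzip > database.sql
To restore immediately instead, replace the redirection with a pipe into mysql on the local side.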
I've kept the gzip compression at a low level, but depending on the CPU power available on the source machine you may want to increase it. However, SQL compresses really well, and I found that even with -1 I was able to transfer 40 MiB/s over a 100 Mbps wire, which was good enough for me.
Display a progress bar while restoring a MySQL dump.
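A common way to get this is pv, which prints a progress bar sized to the dump file. A sketch, with example file and database names and credentials elided:
# dump.sql and mydb are placeholders
pv dump.sql | mysql -u root -p mydb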
Get a gzip-compressed web page using wget. Caution: the command will fail if the website doesn't return gzip-encoded content, though most websites have gzip support nowadays.
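A minimal sketch of the idea, with a placeholder URL:
# example.com is a placeholder; the server must honour the header
wget --header="Accept-Encoding: gzip" -O page.html.gz http://example.com/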
Downloads Bluetack's level 1 IP blocklist in .p2p format, suitable for various BitTorrent clients.
Find all .gz files and recompress them to bz2 on the fly. No temp files. Edit: forgot the double quotes! jeez!
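A sketch of a fully quoted version (the quoting inside the sh -c body makes it safe for filenames with spaces):
find . -type f -name '*.gz' -exec sh -c 'gunzip -c "$1" | bzip2 -c > "${1%.gz}.bz2" && rm -f "$1"' _ {} \;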
If you have servers on a Wide Area Network (WAN), you may experience very slow transfers due to limited bandwidth and latency. To speed up your transfers, compress the data so there is less to send; the solution is to use a compression tool like gzip, bzip2 or compress before and after the data transfer. Note that ssh's "-C" option is not supported by every ssh version (ssh2, for instance).
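For example, a minimal sketch that compresses on the sending side and decompresses on the receiving side (the host and both paths are hypothetical):
# user@remote, /local/dir and /remote/dest are placeholders; /remote/dest must exist
tar cf - /local/dir | gzip -c | ssh user@remote "gunzip -c | tar xf - -C /remote/dest"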
- recompresses all gz files to bz2 files from this point and below in the directory tree
- output shows the size of the original file, and the size of the new file. Useful.
- conceptually easier to understand than playing tricks with awk and sed.
- don't like the output? Use the following line:
find . -type f -name '*.gz' -print | while IFS= read -r gz; do f=$(basename "$gz" .gz) && d=$(dirname "$gz") && gunzip -c "$gz" | bzip2 -c > "$d/$f.bz2" && rm -f "$gz"; done
The function had to be cut down to meet the maximum command length requirements. The full version of the function is:
extract()
{
    if [ -f "$1" ]; then
        case "$1" in
            *.tar.bz2) tar xvjf "$1" ;;
            *.tar.gz)  tar xvzf "$1" ;;
            *.bz2)     bunzip2 "$1" ;;
            *.rar)     unrar x "$1" ;;
            *.gz)      gunzip "$1" ;;
            *.tar)     tar xvf "$1" ;;
            *.tbz2)    tar xvjf "$1" ;;
            *.tgz)     tar xvzf "$1" ;;
            *.zip)     unzip "$1" ;;
            *.Z)       uncompress "$1" ;;
            *.7z)      7z x "$1" ;;
            *)         echo "'$1' cannot be extracted via >extract<" ;;
        esac
    else
        echo "'$1' is not a valid file!"
    fi
}
Note: This is not my original code. I came across it in a forum somewhere a while ago, and it's been such a useful addition to my .bashrc file that I thought it worth sharing.
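Typical usage, once the function is loaded from your .bashrc (the archive name here is just an example):
extract photos.tar.gz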
Gunzip all .gz files in the current dir.
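The command itself is presumably just the glob (this assumes the expanded file list fits within the shell's argument limit):
gunzip *.gz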