Performs a mysqldump and gzip-compresses the output, putting a timestamp in the resulting dump filename. Inspect the file afterward, for integrity or fun, with this command if you desire:
zcat mysqldump-2009-06-12-07.41.01.tgz | less
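For reference, the dump command being described would look something like this (the database name and credentials are placeholders):
mysqldump -u root -p mydb | gzip > "mysqldump-$(date +%Y-%m-%d-%H.%M.%S).tgz"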
Gzip-compresses each file in a directory separately.
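A minimal sketch; gzip already compresses each argument into its own .gz file:
gzip ./*    # produces one .gz per file; directories are skipped with a warning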
Dumps a compressed svn backup to a file, then emails the file, with any messages as the body of the email.
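One way to do it, assuming mutt is installed (the repository path and address are placeholders):
svnadmin dump /var/svn/repo | gzip > repo.dump.gz
echo "Nightly svn backup attached." | mutt -s "svn backup" -a repo.dump.gz -- admin@example.com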
Tars a directory and compresses it, showing progress and limiting disk IO. Pipe Viewer (pv) displays the progress of the task and can also cap the disk IO rate, which is especially useful on busy servers.
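A sketch of the pattern, capping reads at 10 MB/s (the -s size is what makes pv's percentage accurate; du -sb assumes GNU du):
tar -cf - mydir/ | pv -s "$(du -sb mydir | awk '{print $1}')" -L 10m | gzip > mydir.tar.gz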
Useful when you have multiple files or binary files that you need to transfer to a different host and scp or the like is unavailable. To unpack on the destination host, paste the output into the reverse pipeline: openssl enc -d -base64 | gunzip | tar -x. Terminate openssl's input with ^D. Note: gzip sits outside of tar because using -z in tar produces lots of extra padding.
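The sending side would look something like this (the directory name is a placeholder):
tar -cf - mydir | gzip | openssl enc -base64    # paste the base64 output on the destination host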
This version compresses the data for transport.
This command is for UNIX OSes, such as IBM AIX, that ship plain vanilla System V commands instead of their more functional GNU counterparts.
Like the original command, but the -f allows this one to succeed even if the website returns uncompressed data. From gzip(1) on the -f flag: if the input data is not in a format recognized by gzip, and if the option --stdout is also given, copy the input data without change to the standard output: let zcat behave as cat.
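A sketch assuming the page is fetched with curl (the original command is not shown here):
curl -s http://example.com/ | gzip -dcf | less    # -f passes uncompressed pages through unchanged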
just better bash
Archive all .sh files in a directory into a gzip archive.
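For example (the archive name is a placeholder):
tar -czf shell-scripts.tar.gz ./*.sh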
This one works on the user crontab.
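Presumably something along these lines, saving a compressed copy of the current user's crontab (the exact command is an assumption):
crontab -l | gzip > "crontab-$(date +%F).gz"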
This command compresses the "tmp" directory into an initrd file.
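initrd images are conventionally newc-format cpio archives compressed with gzip, so a sketch would be:
( cd tmp && find . | cpio -o -H newc ) | gzip > initrd.img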
Sets $DISKSIZE to the size of the disk so that the percentage readout of pv is correct. Set /dev/sdX to whatever your disk is. Then pipe dd into pv, and pv into gzip, so you end up with a gzipped image file.
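Putting the description together (device and output names are placeholders; blockdev needs root):
DISKSIZE=$(sudo blockdev --getsize64 /dev/sdX)    # disk size in bytes, for pv's percentage
sudo dd if=/dev/sdX bs=4M | pv -s "$DISKSIZE" | gzip > disk-image.img.gz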
When decompressing big files it can be nice to know how long you have left, so you can go grab a coffee.
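pv supplies the progress bar and ETA when placed in front of the decompressor; a sketch:
pv bigfile.gz | gunzip > bigfile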
Sends errors to a different file every day if it is part of a cron entry. The backup file will overwrite the older version.
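A sketch of the pattern with a hypothetical mysqldump as the cron job (names and paths are placeholders):
mysqldump mydb 2>"/var/log/dump-error-$(date +%F).log" | gzip > /backup/mydb.sql.gz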
Sometimes you might need two copies of data that is in a tar archive. You might unpack it and then copy it, but if IO is slow, you can reduce it by writing the data twice (or more times) in one pass.
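One way to do that in bash, reading the archive once and unpacking it to two destinations with tee and process substitution (paths are placeholders):
cat archive.tar | tee >(tar -xf - -C /mnt/copy1) | tar -xf - -C /mnt/copy2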