
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands tagged gzip - 32 results
for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
2014-03-13 08:36:24
User: pdwalker
Functions: bzip2 echo gunzip rm
0

- Recompresses all .gz files to .bz2 files, from the current directory down the tree.

- The output shows the size of the original file and the size of the new file. Useful.

- Conceptually easier to understand than playing tricks with awk and sed.

- Don't like the output? Use this line instead:

for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done
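If any of the filenames contain spaces, a null-delimited variant is safer (a sketch assuming GNU find and bash):

find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do gunzip -c "$gz" | bzip2 -c > "${gz%.gz}.bz2" && rm -f "$gz"; done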
gzip -cd gzippedarchive.tar.gz | tar -xf -
2013-09-18 17:41:25
User: RAKK
Functions: gzip tar
Tags: gzip aix
0

This command is for UNIX OSes, such as IBM AIX, that ship plain-vanilla System V UNIX commands instead of their more functional GNU counterparts.
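With GNU tar the same extraction is a single command; the pipe above is the portable System V form:

tar -xzf gzippedarchive.tar.gz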

tcpdump -i eth0 -w - | ssh forge.remotehost.com -c arcfour,blowfish-cbc -C -p 50005 "cat - | gzip > /tmp/eth0.pcap.gz"
2013-05-30 07:41:22
User: bhbmaster
Functions: ssh tcpdump
Tags: ssh tcpdump gzip
0

NOTE: When opening the capture you might need to strip the very top line (with notepad++, for example), as it's a stray header.

This is useful when the local machine where you need to run the tcpdump capture doesn't have enough room to save the file, whereas your remote host does.

tcpdump -i eth0 -w - | ssh forge.remotehost.com -c arcfour,blowfish-cbc -C -p 50005 "cat - | gzip > /tmp/eth0.pcap.gz"

You are at PC1, doing a tcpdump of PC1's eth0 interface, and the output is saved on PC2 (forge.remotehost.com in the command above) as /tmp/eth0.pcap.gz.

More info @: http://www.kossboss.com/linuxtcpdump1
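To inspect the compressed capture later without unpacking it on disk, tcpdump can read a pcap stream from stdin (a small sketch):

zcat /tmp/eth0.pcap.gz | tcpdump -nn -r -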

find . -type f -name "*.gz" | while read line ; do gunzip --to-stdout "$line" | bzip2 > "$(echo $line | sed 's/gz$/bz2/g')" ; done
2013-04-12 19:18:21
User: Kaurin
Functions: bzip2 find gunzip read
1

Find all .gz files and recompress them to bz2 on the fly. No temp files.

edit: forgot the double quotes! jeez!
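The sed can also be replaced with bash parameter expansion, which avoids spawning a subshell and sed per file:

find . -type f -name "*.gz" | while read -r line ; do gunzip --to-stdout "$line" | bzip2 > "${line%.gz}.bz2" ; done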

find . -type f -name '*.gz'|awk '{print "zcat", $1, "| bzip2 -c >", $0.".tmp", "&& rename", "s/.gz.tmp/.bz2/", "*.gz.tmp", "&& rm", $0}'|bash
2013-04-11 10:17:57
User: Ztyx
Functions: awk find
-2

This solution is similar to [1] except that it does not have any dependency on GNU Parallel. Also, it tries to minimize the impact on the running system (using ionice and nice).

[1] http://www.commandlinefu.com/commands/view/7009/recompress-all-.gz-files-in-current-directory-using-bzip2-running-1-job-per-cpu-core-in-parallel
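For comparison, a sketch in the spirit of [1], assuming GNU Parallel is installed ({.} is the input filename with its extension stripped):

find . -type f -name '*.gz' | parallel 'zcat {} | bzip2 -c > {.}.bz2 && rm {}'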

gzip -c source.csv > source.csv.gz
2012-10-17 18:31:51
User: cfunz
Functions: gzip
Tags: gzip aix
-1

Use this command to gzip the file, writing the compressed stream to stdout, and redirect stdout to another file.
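The reverse is symmetric: -d decompresses to stdout, and the redirect writes the result back to a file:

gzip -cd source.csv.gz > source.csv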

als some.jar
aunpack foo.tar.bz2
GZIP="--rsyncable" tar -czf something.tgz /something
ssh 10.0.0.4 "gzip -c /tmp/backup.sql" |gunzip > backup.sql
2012-01-06 17:44:06
User: ultips
Functions: gunzip ssh
0

If you have servers on a Wide Area Network (WAN), you may experience very long transfer times due to limited bandwidth and latency.

To speed up your transfers you need to compress the data so you will have less to transfer.

So the solution is to use a compression tool like gzip, bzip2 or compress before and after the data transfer.

Note that ssh's "-C" option is not compatible with every ssh version (ssh2, for instance).
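The same trick works in the other direction, compressing locally and decompressing on the remote end (a sketch reusing the hosts and paths above):

gzip -c backup.sql | ssh 10.0.0.4 "gunzip > /tmp/backup.sql"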

ssh user@host "tar -zcf - /path/to/dir" > dir.tar.gz
2011-12-16 05:48:38
User: __
Functions: ssh
Tags: ssh tar gzip
16

This improves on #9892 by compressing the directory on the remote machine so that the amount of data transferred over the network is much smaller. The command uses ssh(1) to get to a remote host, uses tar(1) to archive and compress a remote directory, prints the result to STDOUT, which is written to a local file. In other words, we are archiving and compressing a remote directory to our local box.

ssh user@host "tar -czf - /path/to/dir" > dir.tar.gz
ssh user@host "tar -cf - /path/to/dir" | gzip > dir.tar.gz
2011-12-14 15:54:57
User: atoponce
Functions: gzip ssh
Tags: ssh tar gzip
6

The command uses ssh(1) to get to a remote host, uses tar(1) to archive a remote directory, prints the result to STDOUT, which is piped to gzip(1) to compress to a local file. In other words, we are archiving and compressing a remote directory to our local box.

pv file | gzip > file.gz
if curl -s -I -H "Accept-Encoding: gzip,deflate" http://example.com/ | grep 'Content-Encoding: gzip' >/dev/null 2>&1 ; then echo Yes; else echo No;fi
curl -I -H "Accept-Encoding: gzip,deflate" http://example.org
tar -caf some_dir.tar.xz some_dir
2011-06-09 19:00:06
Functions: tar
0

The -a flag causes tar to automatically pick the right compressor to filter the archive through, based on the file extension. For example:

"tar -xaf archive.tar.xz" is equivalent to "tar -xJf archive.tar.xz"

"tar -xaf archive.tar.gz" is equivalent to "tar -xzf archive.tar.gz"

No need to remember -z is gzip, -j is bzip2, -Z is .Z, -J is xz, and so on :)

ssh username@remotehost 'mysqldump -u <dbusername> -p<dbpassword> <dbname> tbl_name_1 tbl_name_2 tbl_name_3 | gzip -c -' | gzip -dc - | mysql -u <localusername> -p<localdbpassword> <localdbname>
wget -q -O- --header\="Accept-Encoding: gzip" <url> | gunzip > out.html
2010-11-27 22:14:42
User: ashish_0x90
Functions: gunzip wget
1

Get gzip compressed web page using wget.

Caution: the command will fail if the website doesn't return gzip-encoded content, though most websites support gzip nowadays.
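curl achieves the same effect with its --compressed flag, which sends the Accept-Encoding header and transparently decompresses the response:

curl -s --compressed -o out.html http://example.org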

alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" &>/dev/null ) & )'
2010-11-18 06:24:34
User: AskApache
Functions: alias date tar
7

This is freaking sweet!!! Here is the full alias, (I didn't want to cause display problems on commandlinefu.com's homepage):

alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); S=$SECONDS; tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" && logger -s "Tarred $D to $F in $(($SECONDS-$S)) seconds" ) & )'

Creates a .tgz archive of whatever directory it is run from, in the background, detached from current shell so if you logout it will still complete. Also, you can run this as many times as you want, if the archive .tgz already exists, it just moves it to a numbered backup '--backup=numbered'. The coolest part of this is the transformation performed by tar and sed so that the archive file names are automatically created, and when you extract the archive file it is completely safe thanks to the transform command.

If you archive, let's say, /home/tombdigger/new-stuff-to-backup/, it will create the archive /home/#home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz. Then when you extract it, like tar -xvzf #home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz, instead of overwriting an existing /home/tombdigger/new-stuff-to-backup/ directory, it will extract to /home/tombdigger/new-stuff-to-backup.2010-11-18/

Basically, the tar archive filename is the PWD with all '/' replaced with '#', and the date is appended to the name so that multiple archives are easily managed. This example saves all archives to your $HOME/archive-name.tgz, but I have a $BKDIR variable with my backup location for each shell user, so I just replaced HOME with BKDIR in the alias.

So when I ran this in /opt/askapache/SOURCE/lockfile-progs-0.1.11/ the archive was created at /askapache-bk/#opt#askapache#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz

Upon completion, it uses the universal logger tool to report completion to syslog and stderr (printed to your terminal); just remove that part if you don't want it, or remove the '-s' option from logger to keep the logs only in syslog and off your terminal.

Here's how my syslog server recorded this..

2010-11-18T00:44:13-05:00 gravedigger.askapache.com (127.0.0.5) [user] [notice] (logger:) Tarred /opt/askapache/SOURCE/lockfile-progs-0.1.11 to /askapache-bk/tarred/#opt#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz in 4 seconds

Caveats

Really, this is very robust and foolproof; the only issue I ever have with it (I've been using this for years on my web servers) is that if a file changes in the directory while the archive is being created, you get a warning message and your archive might have a problem for the changed file. This happens when running this in a logs directory, a temp dir, etc. That's the only issue I've ever had, really nothing more than a heads up.

Advanced:

This is a simple alias, and very useful, as it works on basically every Linux box with semi-current tar, GNU coreutils, bash, and sed. But if you want to customize it or pass parameters (like a dir to back up instead of the pwd), check out the function below, which is what I created the alias from, BTW, replacing my aa_status function with logger and adding $SECONDS runtime instead of using tar's --totals.

function tarred ()
{
    local GZIP='--fast' PWD=${1:-`pwd`} F=$(date +${BKDIR}/%m-%d-%g-%H%M-`sed -u 's/[\/\ ]/#/g' <<< ${PWD/${HOME}/}`.tgz)  # NB: this F= line was cut off in the original; the sed/date tail is reconstructed by analogy with the alias above
    [[ ! -r "$PWD" ]] && echo "Bad permissions for $PWD" 1>&2 && return 2;
    ( ( tar --totals --ignore-failed-read --transform "s@^${PWD%/*}@`date +${PWD%/*}.%m-%d-%g`@S" -czPf "$F" "$PWD" && aa_status "Completed Tarp of $PWD to $F" ) & )
}

#From my .bash_profile http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

gzexe name ...
2010-09-27 19:57:43
User: bogomips
Functions: gzexe
Tags: gzip
5

The gzexe utility allows you to compress executables in place and have them automatically uncompress and execute when you run them. FYI: you can compress any executable shebang script as well (py, pl, sh, tcl, etc.).
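A usage sketch (myscript is a placeholder name; gzexe keeps the uncompressed original as myscript~, and -d reverses the compression):

gzexe ~/bin/myscript    # compress in place; backup kept as ~/bin/myscript~
gzexe -d ~/bin/myscript # restore the uncompressed original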

dump -0Lauf - /dev/adXsYz | gzip > /path/to/adXsYz.dump.gz
2010-07-19 00:54:40
Functions: dump gzip
2

Opens a snapshot of a live UFS2 filesystem, runs dump to generate a full filesystem backup which is run through gzip. The filesystem must support snapshots and have a .snap directory in the filesystem root.

To restore the backup, one can do

zcat /path/to/adXsYz.dump.gz | restore -rf -
gzip *
2010-03-29 10:58:40
User: funky
Functions: gzip
Tags: gzip
-3

Should do exactly the same - compress every file in the current directory. You can even use it recursively:

gzip -r .
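The inverse is just as short; -d decompresses and -r recurses:

gzip -dr .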
zcat database.sql.gz | mysql -uroot -p'passwd' database
2010-03-23 12:41:57
User: rubenmoran
Functions: zcat
Tags: mysql gzip zcat
5

This way you keep the file compressed saving disk space.

Another way, less optimal, using named pipes:

mysql -uroot -p'passwd' database < <(zcat database.sql.gz)
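If the dump is large, pv gives a progress bar while keeping the file compressed on disk (assuming pv is installed):

pv database.sql.gz | zcat | mysql -uroot -p'passwd' database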

for I in $(mysql -e 'show databases' -u root --password=root -s --skip-column-names); do mysqldump -u root --password=root $I | gzip -c | ssh user@server.com "cat > /remote/$I.sql.gz"; done
2010-03-07 15:03:12
User: juliend2
Functions: gzip ssh
6

It grabs all the database names granted to the MySQL user, dumps each one, and gzips it to a remote host via SSH.
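A hedged refinement: filter out the information_schema pseudo-database, which mysqldump often cannot dump without extra privileges:

for I in $(mysql -e 'show databases' -u root --password=root -s --skip-column-names | grep -v '^information_schema$'); do mysqldump -u root --password=root $I | gzip -c | ssh user@server.com "cat > /remote/$I.sql.gz"; done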