What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Hide

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 or at least 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using gzip - 51 results
cd tmp; find . | cpio -o -H newc | gzip > ../initrd.gz
2014-09-24 14:07:54
User: akiuni
Functions: cd cpio find gzip
0

This command compresses the "tmp" directory into an initrd file.
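
To inspect the result, the archive can be unpacked the same way in reverse (a sketch; run it from an empty scratch directory and adjust the path to wherever the initrd was written):

gzip -dc /path/to/initrd.gz | cpio -idv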

gzip -cd gzippedarchive.tar.gz | tar -xf -
2013-09-18 17:41:25
User: RAKK
Functions: gzip tar
Tags: gzip aix
0

This command is for UNIX operating systems, such as IBM AIX, that ship plain System V commands rather than their more featureful GNU counterparts.
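
On a system with GNU tar, the same extraction can be done in one step:

tar -xzf gzippedarchive.tar.gz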

tar cfp - file-to-be-archived | gzip > archive.tar.gz
export BLOCKSIZE=$(sudo blockdev --getsize64 /dev/sdc) && sudo dd if=/dev/sdc bs=1MB | pv -s $BLOCKSIZE | gzip -9 > USB_SD_BACKUP.img.gz
2013-02-05 18:10:25
User: hur1can3
Functions: dd export gzip sudo
2

A useful command to back up an SD card; the device's exact size is passed to pv so it can show an accurate progress bar.
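
To restore the image later, the pipeline runs in reverse (a sketch; the device name is illustrative, so double-check it before writing):

gzip -dc USB_SD_BACKUP.img.gz | sudo dd of=/dev/sdc bs=1MB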

gzip -c ~/.bash_history > ~/.backup/history-save-$(date +\%d-\%m-\%y-\%T).gz
2013-01-11 17:31:07
User: tictacbum
Functions: date gzip
Tags: history backup
0

The percent signs are escaped so this works from a user crontab, where an unescaped % would otherwise be treated as a newline.
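
A sample crontab entry (the 3 a.m. schedule is only an example):

0 3 * * * gzip -c ~/.bash_history > ~/.backup/history-save-$(date +\%d-\%m-\%y-\%T).gz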

gzip -c source.csv > source.csv.gz
2012-10-17 18:31:51
User: cfunz
Functions: gzip
Tags: gzip aix
-1

Use this command to gzip the file, writing the compressed data to stdout and redirecting stdout into another file.
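
Decompression works the same way in reverse:

gzip -cd source.csv.gz > source.csv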

sudo find . -name "syslog*.gz" -type f | xargs gzip -cd | grep "Mounted"
find . -type f -name '*.sh' -print | cpio -o | gzip >sh.cpio.gz
2011-12-21 21:13:29
User: djangofan
Functions: cpio find gzip
0

Archives all .sh files under the current directory into a gzip-compressed cpio archive.

ssh user@host "tar -cf - /path/to/dir" | gzip > dir.tar.gz
2011-12-14 15:54:57
User: atoponce
Functions: gzip ssh
Tags: ssh tar gzip
6

The command uses ssh(1) to get to a remote host, uses tar(1) to archive a remote directory, prints the result to STDOUT, which is piped to gzip(1) to compress to a local file. In other words, we are archiving and compressing a remote directory to our local box.
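
If the network link is the bottleneck, a common variation (not the author's command) is to compress on the remote side before the data crosses the wire, assuming the remote tar supports -z:

ssh user@host "tar -czf - /path/to/dir" > dir.tar.gz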

sudo dd if=/dev/block/device bs=1MB | pv -s $(sudo blockdev --getsize64 /dev/block/device) | gzip -9 > output.img.gz
while read; do mysqldump $REPLY | gzip > "/backup/mysql/$REPLY.sql.gz"; done < <( mysql -e 'show databases' -s --skip-column-names )
pv file | gzip > file.gz
git archive HEAD | gzip > ~/Dropbox/archive.tar.gz
mkdir copy{1,2}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) | tar x -C copy2/
2011-04-14 17:02:05
User: depesz
Functions: gzip mkdir tar tee
Tags: bash tee tar
-1

Sometimes you need two copies of data that is stored in a tar archive. You could unpack it once and then copy the result, but if I/O is slow you can reduce it by extracting to both destinations in a single pass (or more, with additional tee targets).
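
As a sketch of the "or more" case, tee accepts several process substitutions, so a third copy only needs one more target:

mkdir copy{1,2,3}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) >( tar x -C copy2/ ) | tar x -C copy3/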

ssh username@remotehost 'mysqldump -u <dbusername> -p<dbpassword> <dbname> tbl_name_1 tbl_name_2 tbl_name_3 | gzip -c -' | gzip -dc - | mysql -u <localusername> -p<localdbpassword> <localdbname>
tar -c bins/ | gzip -9 | openssl enc -base64
2011-02-24 22:15:23
User: mweed
Functions: gzip tar
1

Useful when you have multiple files or binary files that you need to transfer to a different host and scp or the like is unavailable.

To unpack on the destination host, paste the copied output into the reverse pipeline:

openssl enc -d -base64 | gunzip | tar -x

Terminate openssl's input with Ctrl-D.

Note: gzip is outside of tar because using -z in tar produces lots of extra padding.

sitepass2() { salt="this_salt";pass=`echo -n "$@"`;for i in {1..500};do pass=`echo -n $pass$salt|sha512sum`;done;echo $pass|gzip -|strings -n 1|tr -d "[:space:]"|tr -s '[:print:]' |tr '!-~' 'P-~!-O'|rev|cut -b 2-15;history -d $(($HISTCMD-1));}
2010-12-09 08:42:24
User: Soubsoub
Functions: cut gzip strings tr
Tags: Security
-4

This is a safer variation of the "sitepass" function: it adds a salt and repeatedly feeds the result through sha512sum in a long loop.

gzip -cd file.gz | ssh user@host 'dd of=~/file'
2010-09-20 11:44:19
User: twfcc
Functions: gzip ssh
-3

An easy way to decompress a file and copy it to a remote machine without ever writing the decompressed file to the local disk.
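
A variation with the same effect, using a plain redirection on the remote side instead of dd:

gzip -cd file.gz | ssh user@host 'cat > file'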

dump -0Lauf - /dev/adXsYz | gzip > /path/to/adXsYz.dump.gz
2010-07-19 00:54:40
Functions: dump gzip
2

Opens a snapshot of a live UFS2 filesystem, runs dump to generate a full filesystem backup, and pipes it through gzip. The filesystem must support snapshots and have a .snap directory in the filesystem root.

To restore the backup, one can do

zcat /path/to/adXsYz.dump.gz | restore -rf -
pv -cN orig < foo.tar.bz2 | bzcat | pv -cN bzcat | gzip -9 | pv -cN gzip > foo.tar.gz
2010-04-16 05:21:10
User: rkulla
Functions: gzip
0

In this example we convert a .tar.bz2 file to a .tar.gz file.

If you don't have Pipe Viewer, install it first (e.g. apt-get install pv).
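
Without pv, the same conversion is a shorter, progress-free pipeline:

bzcat foo.tar.bz2 | gzip -9 > foo.tar.gz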

tar pcf - home | pv -s $(du -sb home | awk '{print $1}') --rate-limit 500k | gzip > /mnt/c/home.tar.gz
2010-04-02 15:29:03
User: Sail
Functions: awk du gzip tar
1

Tars a directory and compresses it while showing progress and limiting disk I/O. Pipe Viewer displays the progress of the task and can also throttle the disk I/O, which is especially useful on busy servers.
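
To restore with a progress bar as well (the target directory here is illustrative):

pv /mnt/c/home.tar.gz | gzip -dc | tar xpf - -C /restore/target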

gzip *
2010-03-29 10:58:40
User: funky
Functions: gzip
Tags: gzip
-3

Should do exactly the same - compress every file in the current directory. You can even use it recursively:

gzip -r .
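
The reverse works recursively too:

gunzip -r .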
mysqldump --lock-tables --opt DBNAME -u UNAME --password=PASS | gzip > OUTFILE
(svnadmin dump /path/to/repo | gzip --best > /tmp/svn-backup.gz) 2>&1 | mutt -s "SVN backup `date +\%m/\%d/\%Y`" -a /tmp/svn-backup.gz emailaddress
2010-03-08 05:49:01
User: max
Functions: dump gzip
1

Dumps a compressed SVN backup to a file and emails it, with any messages from the dump included as the body of the email.
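
To restore from the emailed dump (the new repository path is illustrative):

gunzip -c /tmp/svn-backup.gz | svnadmin load /path/to/new-repo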

for I in $(mysql -e 'show databases' -u root --password=root -s --skip-column-names); do mysqldump -u root --password=root $I | gzip -c | ssh user@server.com "cat > /remote/$I.sql.gz"; done
2010-03-07 15:03:12
User: juliend2
Functions: gzip ssh
6

It grabs all the database names visible to the given MySQL user, dumps each one, and gzips it to a remote host via SSH.
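
To restore one of the dumps on the remote host, a matching sketch (the target database must already exist, since mysqldump without --databases omits CREATE DATABASE):

gzip -dc /remote/DBNAME.sql.gz | mysql -u root --password=root DBNAME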