
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users are automatically assigned a username, which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using gunzip - 11 results
gunzip -c x.txt.gz >x.txt
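With gzip 1.6 or later, gunzip's --keep flag gives the same result without the redirection (a minor variant, assuming a recent gzip):

gunzip -k x.txt.gz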
for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
2014-03-13 08:36:24
User: pdwalker
Functions: bzip2 echo gunzip rm
Votes: 0

- Recompresses all .gz files to .bz2 files from this point down the directory tree.

- The output shows the size of the original file and the size of the new file. Useful.

- Conceptually easier to understand than playing tricks with awk and sed.

- Don't like the output? Use the following line:

for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done
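A minimal sketch of a more robust variant, assuming GNU find and bash; -print0 with a null-delimited read keeps filenames containing spaces intact:

# Recompress every .gz below the current directory to .bz2, safely quoting paths.
find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do
  gunzip -c "$gz" | bzip2 -c > "${gz%.gz}.bz2" && rm -f "$gz"
done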
find . -type f -name "*.gz" | while read line ; do gunzip --to-stdout "$line" | bzip2 > "$(echo $line | sed 's/gz$/bz2/g')" ; done
2013-04-12 19:18:21
User: Kaurin
Functions: bzip2 find gunzip read
Votes: 1

Find all .gz files and recompress them to bz2 on the fly. No temp files.

edit: forgot the double quotes! jeez!

ssh 10.0.0.4 "gzip -c /tmp/backup.sql" |gunzip > backup.sql
2012-01-06 17:44:06
User: ultips
Functions: gunzip ssh
Votes: 0

If you have servers on a Wide Area Network (WAN), you may see very slow transfers due to limited bandwidth and latency.

To speed up your transfers, you need to compress the data so there is less to send.

So the solution is to use a compression tool like gzip or bzip2 before and after the data transfer.

The ssh "-C" option is not supported by every ssh version (ssh2, for instance).
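Where the "-C" option is available, the same effect needs no explicit gzip/gunzip pair (a sketch reusing the host and path above):

ssh -C 10.0.0.4 "cat /tmp/backup.sql" > backup.sql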

wget -q -O- --header\="Accept-Encoding: gzip" <url> | gunzip > out.html
2010-11-27 22:14:42
User: ashish_0x90
Functions: gunzip wget
Votes: 2

Get a gzip-compressed web page using wget.

Caution: the command will fail if the website doesn't return gzip-encoded content, though most websites support gzip these days.
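An alternative sketch using curl, whose --compressed flag sends the Accept-Encoding header and transparently decompresses the response, so it also copes with servers that return plain content:

curl -s --compressed <url> > out.html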

ls | grep .gz >> list.txt && cat list.txt | while read x ; do gunzip -d $x ; done && rm -rf list.txt
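No description was given; for what it's worth, the temporary file is unnecessary, and assuming the .gz files sit in the current directory the same result is a one-liner:

gunzip ./*.gz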
pv bigdump.sql.gz | gunzip | mysql
2010-06-11 14:26:13
User: skygreg
Functions: gunzip
Votes: 3

Display a progress bar while restoring a MySQL dump.
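For reference, a fuller invocation; the user and database name (mydb) are placeholders:

pv bigdump.sql.gz | gunzip | mysql -u root -p mydb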

ssh 10.0.0.4 "cat /tmp/backup.sql | gzip -c1" | gunzip -c > backup.sql
2010-02-14 19:09:07
User: kennethjor
Functions: gunzip ssh
Votes: 3

I've kept the gzip compression at a low level, but depending on the CPU power available on the source machine you may want to increase it. However, SQL compresses really well, and I found that even with -1 I was able to transfer 40 MiB/s over a 100 Mbps wire, which was good enough for me.
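If the source machine has CPU cycles to spare, raising the level is a one-character change (a sketch based on the command above):

ssh 10.0.0.4 "cat /tmp/backup.sql | gzip -c9" | gunzip -c > backup.sql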

ssh user@host "mysqldump -h localhost -u mysqluser -pP@$$W3rD databasename | gzip -cf" | gunzip -c > database.sql
2009-10-05 00:57:51
User: daws
Functions: gunzip ssh
Votes: 8

This command dumps a database on a remote server to stdout, compresses it, streams it to your local machine, decompresses it and writes it to a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately (see the sketch below). I had to use this recently because the server I needed a backup from didn't have enough disk space.
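A sketch of that immediate-restore variant; the local credentials and database name (localdb) are placeholders, and the remote password is the author's own placeholder, kept verbatim:

ssh user@host "mysqldump -h localhost -u mysqluser -pP@$$W3rD databasename | gzip -cf" | gunzip -c | mysql -u localuser -p localdb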

gunzip < foo.tar.gz | tar xvf -
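With a tar that has gzip support built in (GNU and BSD tar both do), the same extraction is a single command:

tar xzvf foo.tar.gz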
gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
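No description was given; the pipeline merges the rotated and current auth logs, filters the "Invalid user" lines that sshd logs for failed logins, and counts how often each attempted username (field 8, assuming the standard syslog line format) was tried.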