Terminal - Commands tagged compression - 15 results
for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
2014-03-13 08:36:24
User: pdwalker
Functions: bzip2 echo gunzip rm
0

- recompresses all gz files to bz2 files from this point and below in the directory tree

- output shows the size of the original file, and the size of the new file. Useful.

- conceptually easier to understand than playing tricks with awk and sed.

- Don't like the output? Use the following line:

for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done
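A hedged variant of the same loop that also survives spaces in file names (assumes GNU find and bash):

find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do gunzip -c "$gz" | bzip2 -c > "${gz%.gz}.bz2" && rm -f "$gz"; done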
find /logdir -type f -mtime +7 -print0 | xargs -0 -n 1 nice -n 20 bzip2 -9
tar --exclude-from=$excludefile -zcvp "$source" | openssl aes-128-cbc -salt -out $targetfile -k $key
2013-12-13 19:35:20
User: klausman
Functions: tar
0

Create compressed, encrypted backup from $source to $targetfile with password $key and exclude-file $excludefile
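The matching restore step would be something like this (assuming the same $targetfile and $key as above):

openssl aes-128-cbc -d -in $targetfile -k $key | tar -xzvf -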

find . -type f -name "*.gz" | while read line ; do gunzip --to-stdout "$line" | bzip2 > "$(echo $line | sed 's/gz$/bz2/g')" ; done
2013-04-12 19:18:21
User: Kaurin
Functions: bzip2 find gunzip read
1

Find all .gz files and recompress them to bz2 on the fly. No temp files.

edit: forgot the double quotes! jeez!

find . -type f -name '*.gz'|awk '{print "zcat", $1, "| bzip2 -c >", $0.".tmp", "&& rename", "s/.gz.tmp/.bz2/", "*.gz.tmp", "&& rm", $0}'|bash
2013-04-11 10:17:57
User: Ztyx
Functions: awk find
-2

This solution is similar to [1] except that it does not have any dependency on GNU Parallel. Also, it tries to minimize the impact on the running system (using ionice and nice).

[1] http://www.commandlinefu.com/commands/view/7009/recompress-all-.gz-files-in-current-directory-using-bzip2-running-1-job-per-cpu-core-in-parallel
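The description mentions ionice and nice, though the one-liner above does not actually include them; a hedged sketch that runs one job per core with both (assumes bash, GNU xargs and util-linux ionice):

recompress() { gunzip -c "$1" | bzip2 -c > "${1%.gz}.bz2" && rm -f "$1"; }; export -f recompress; find . -type f -name '*.gz' -print0 | xargs -0 -n1 -P "$(nproc)" ionice -c3 nice -n19 bash -c 'recompress "$1"' bash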

XZ_OPT=-9 tar cJf tarfile.tar.xz directory
2013-03-30 06:00:39
Functions: tar
0

This preserves modification times, permissions, the directory structure, etc.
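To unpack, the symmetric command is:

tar xJf tarfile.tar.xz

With xz >= 5.2 you should also be able to parallelize the compression by adding -T0, e.g. XZ_OPT="-9 -T0" (a hedged suggestion; check your xz version).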

als some.jar
aunpack foo.tar.bz2
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/screen -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
2012-02-29 21:30:20
User: DavideRiboli
Functions: gs
0

Useful if you want to reduce PDF file size from the command line using Ghostscript.
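The /screen preset is the most aggressive; swapping in one of Ghostscript's other standard presets (/ebook, /printer, /prepress, in increasing quality order) trades file size for quality:

gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf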

ssh 10.0.0.4 "gzip -c /tmp/backup.sql" |gunzip > backup.sql
2012-01-06 17:44:06
User: ultips
Functions: gunzip ssh
0

If you have servers on a Wide Area Network (WAN), you may experience very slow transfers due to limited bandwidth and latency.

To speed up your transfers you need to compress the data so you have less to transfer.

So the solution is to use a compression tool like gzip, bzip2 or compress before and after the data transfer.

The ssh "-C" option is not supported by every ssh version (ssh2, for instance).
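The same trick works in the other direction, pushing a local dump to the remote host (paths here are hypothetical):

gzip -c backup.sql | ssh 10.0.0.4 "gunzip -c > /tmp/backup.sql"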

function xzv() { THREADS=$(grep -c processor /proc/cpuinfo); for file in "$@"; do pv -s "$(stat -c%s "$file")" < "$file" | pxz -q -T "$THREADS" > "$file.xz"; done; }
2011-12-14 08:22:08
User: oernii2
Functions: file wc
0

You need pxz for the actual work (http://jnovy.fedorapeople.org/pxz/) and pv for the progress display. The function could be improved with better multi-file and stdin/stdout support.
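Usage would be, for example (file names are hypothetical, and pv and pxz must be installed):

xzv access.log error.log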

optipng -o3 *png && advpng -z -4 *png && advdef -z -4 *png
2010-05-22 23:30:21
User: tamasrepus
4

optipng and advancecomp (for the advpng and advdef tools) are the best FOSS tools for losslessly compressing PNGs. With the above tool chain, you can cut as much as 20% off a PNG's file size.

ssh <host> 'tar -cz /<folder>/<subfolder>' | tar -xvz
2009-11-10 20:06:47
User: polaco
Functions: ssh tar
9

This command will copy a folder tree (keeping the parent folders) through ssh. It will:

- compress the data

- stream the compressed data through ssh

- decompress the data on the local folder

This command will take no additional space on the host machine (no need to create a compressed tar file, transfer it and then delete it on the host).

There are some situations (like mirroring a remote machine) where you simply can't wait for a slow scp transfer, or can't compress the data to a tarball on the host because of file-system space limitations, so this command can do the job quite well.

This command performs very well mainly when a lot of data is involved in the process. If you're copying a small amount of data, use scp instead (easier to type).
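A hedged variant in the opposite direction, pushing a local tree to the remote host (the /target path here is hypothetical):

tar -czf - <folder> | ssh <host> 'tar -xvzf - -C /target'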

cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
2009-06-02 15:51:51
User: lrvick
Functions: awk cat
0

Ever compress a file for the web by replacing all newline characters with nothing so it makes one nice big blob?

It is a great idea; however, what about when you want to edit that file? ...Serious pain in the butt.

I ran into this today in that my only copy of a CSS file was "compressed" with no newlines.

I whipped this up and it converted back into nice human readable CSS :-)

It could be nicer, but it does the job.
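A similar effect with sed instead of awk (assumes GNU sed, which understands \n in the replacement):

sed 's/[{};]/&\n/g' somefile.css > uncompressed.css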

rar a -m5 -v5M -R myarchive.rar /home/
2009-05-27 15:53:18
User: piovisqui
0

a - add files to archive

-m5 - compression level, 0 = lowest compression ... 5 = maximum compression

-v5M - split the output into 5-megabyte volumes; change to 700M for a CD, or 4200M for a DVD

-R - recurse into directories; do not use it for single files

It's better to have the compressed output already split than to use the 'split' command after compression, which would consume double the disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html
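To extract, pointing unrar at the first volume should be enough; the remaining volumes are picked up automatically (assumes unrar is installed and rar's default .partN.rar volume naming):

unrar x myarchive.part1.rar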