
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands tagged split - 14 results
FILE=file_name; CHUNK=$((64*1024*1024)); SIZE=$(stat -c "%s" $FILE); for ((i=0; i < $SIZE; i+=$CHUNK)); do losetup --find --show --offset=$i --sizelimit=$CHUNK $FILE; done
2014-10-03 13:18:19
User: flatcap
Functions: losetup stat
5

It's common to want to split up large files, and the usual method is to use split(1). But if you have a 10GiB file, you'll need 10GiB of free space, and the OS has to read 10GiB and write 10GiB (usually on the same filesystem). This takes AGES.

This command instead uses a set of loop block devices to create fake chunks, without making any changes to the file, so the splitting is nearly instantaneous. The example takes a 1GiB file and splits it into 16 x 64MiB chunks (/dev/loop0 .. loop15).

Note: This isn't a drop-in replacement for split. The results are block devices, so tar and zip won't do what you expect when given them.

These commands will work:

hexdump /dev/loop4

gzip -9 < /dev/loop6 > part6.gz

cat /dev/loop10 > /media/usb/part10.bin

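The chunk arithmetic can be sanity-checked without root (which losetup needs); this sketch only counts the loop devices the command above would allocate, with the 1 GiB file size hard-coded as an assumption:

```shell
# Count how many 64 MiB loop devices the command would allocate for a
# 1 GiB file, without calling losetup (which requires root).
SIZE=$((1024*1024*1024))   # pretend stat reported a 1 GiB file
CHUNK=$((64*1024*1024))
NCHUNKS=0
i=0
while [ "$i" -lt "$SIZE" ]; do
  # here the real command would run:
  # losetup --find --show --offset=$i --sizelimit=$CHUNK "$FILE"
  NCHUNKS=$((NCHUNKS+1))
  i=$((i+CHUNK))
done
echo "$NCHUNKS"
```

When finished with the chunks, each device can be released with `losetup -d /dev/loopN`.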
convert yourdoublewideimage.jpg -crop 50%x100% +repage output.jpg
2014-01-15 15:34:41
User: pagesix1536
0

Output should be two JPG files named like "output-1.jpg" and "output-2.jpg". The convert command is part of ImageMagick so you'll need that and dependent packages installed to use it.

base64 /dev/urandom | head -c 33554432 | split -b 8192 -da 4 - dummy.
9

Avoiding a for loop brought this time down to less than 3 seconds on my old machine. And just to be clear, 33554432 = 8192 * 4096.
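A scaled-down sketch of the same split usage (64 KiB of random data and an invented `piece.` prefix, so it runs instantly), verifying the piece count and a lossless rejoin:

```shell
# Split 64 KiB into 8 KiB pieces with numeric suffixes, then verify
# the piece count and that concatenating the pieces restores the file.
tmp=$(mktemp -d)
head -c 65536 /dev/urandom > "$tmp/orig"
split -b 8192 -d -a 4 "$tmp/orig" "$tmp/piece."
NPIECES=$(ls "$tmp"/piece.* | wc -l)          # 65536 / 8192 = 8 pieces
cat "$tmp"/piece.* > "$tmp/rejoined"
cmp -s "$tmp/orig" "$tmp/rejoined" && SAME=yes || SAME=no
```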

vim -O file1 file2
vim -d '+diffoff!' file1 file2
2012-08-30 07:51:41
User: greggster
Functions: vim
2

Use vim's diff mode to edit two or more files in one window. The '+diffoff!' turns off diff highlighting when the session is started.

Use ctrl+w + ctrl+w to switch between windows.

awk -F'\t' '{print $0 >> ($5 ".tsv")}'
2012-05-16 18:18:16
User: pykler
Functions: awk
Tags: awk split tsv
0

Splits the standard input lines into files grouped by the content of the 5th column.
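A minimal runnable sketch of the same grouping idea, keyed on column 2 of some made-up input rather than column 5:

```shell
# Group tab-separated lines into one file per distinct value in column 2.
tmp=$(mktemp -d)
cd "$tmp"
printf 'a\tx\nb\ty\nc\tx\n' | awk -F'\t' '{print $0 >> ($2 ".tsv")}'
NFILES=$(ls *.tsv | wc -l)     # two groups: x.tsv and y.tsv
XLINES=$(wc -l < x.tsv)        # rows "a" and "c" land in x.tsv
```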

FOR /F "tokens=3* delims=[]=," %A IN ('SET ARRAY[') DO ( echo %A -- %B )
2010-08-10 12:12:27
User: Marco
Functions: echo
-2

Loops over an array stored in environment variables, splits its values, and puts them into %A, %B, %C, %D, and so on.

Create array before, like

set ARRAY[0]=test1,100

and

set ARRAY[1]=test2,200

Be sure to replace %A, %B, etc. with %%A, %%B, etc. when using this from inside of batch files.

ffmpeg -i 100_0029.MOV -ss 00:00:00 -t 00:04:00 100_0029_1.MOV
2010-08-08 23:43:28
User: nickleus
2

I have a large video file (500+ MB), so I can't upload it to Flickr; to reduce the size I split it into 2 files. The command shows the splitting for the first file, from 0 to 4 minutes. -ss is the start time and -t is the duration (how long you want the output file to be).

Credit goes to philc: http://ubuntuforums.org/showthread.php?t=480343

NOTE: when I made the second half of the video, I got a *lot* of lines like this:

frame= 0 fps= 0 q=0.0 size= 0kB time=10000000000.00 bitrate= 0.0kbit

Just be patient, it is working =)

vim -p file1 file2 [...]
vim -o file1 file2...
2010-04-13 22:09:47
User: rkulla
Functions: vim
Tags: vim split
7

-o acts like :split. Use -O (capital O) for side-by-side like :vsplit. Use vim -d or vimdiff if you need a diff(1) comparison.

To split gnu Screen instead of vim, use ^A S for horizontal, ^A | for vertical.

split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -a "$i" -s "$SUBJ" YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split
-1

This is just a little snippit to split a large file into smaller chunks (4mb in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

edit: (changed "g" to "e" for political correctness)
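The whole encrypt/split/rejoin/decrypt cycle can be exercised end-to-end with a tiny stand-in payload and 1 KiB chunks (the mutt mailing step is skipped, and the file names here are assumptions):

```shell
# Encrypt, split into 1 KiB pieces, rejoin, decrypt, and verify the
# restored file matches the original payload byte for byte.
tmp=$(mktemp -d)
head -c 5000 /dev/urandom > "$tmp/file.tgz"    # stand-in archive
openssl des3 -salt -k secret -in "$tmp/file.tgz" -out "$tmp/file.tgz.des3"
split -b1k "$tmp/file.tgz.des3" "$tmp/file.tgz.des3."
cat "$tmp"/file.tgz.des3.* > "$tmp/rejoined.des3"
openssl des3 -d -salt -k secret -in "$tmp/rejoined.des3" -out "$tmp/restored.tgz"
cmp -s "$tmp/file.tgz" "$tmp/restored.tgz" && RESTORED=yes || RESTORED=no
```

Note that modern OpenSSL releases flag des3 with `-k` as legacy/deprecated; it still round-trips, but a current cipher and `-pbkdf2` would be preferable for new backups.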

dd if=inputfile of=split3 bs=16m count=32 skip=64
2010-02-21 10:09:46
User: jearsh
Functions: dd
Tags: dd file split
2

bs = block size (defines the size of a "unit" used by count and skip)

count = the number of blocks to copy (16m * 32 = 1/2 gig)

skip = the number of blocks to skip before copying; we are grabbing piece 3, which means 2 pieces have already been written, so skip (2 * 32) = 64

I will edit this later if I can to make this all more understandable.
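The same piece extraction scaled down to tiny numbers (bs=16 bytes instead of 16m), which makes the skip arithmetic easy to check:

```shell
# Extract "piece 3" of a file with dd: pieces are bs*count = 16*32 = 512
# bytes, and skip = 2*count = 64 blocks jumps over the first two pieces.
tmp=$(mktemp -d)
head -c 2048 /dev/urandom > "$tmp/inputfile"   # 4 pieces of 512 bytes
dd if="$tmp/inputfile" of="$tmp/split3" bs=16 count=32 skip=64 2>/dev/null
PIECE_SIZE=$(wc -c < "$tmp/split3")
# the same 512 bytes extracted with tail/head, for comparison:
tail -c +1025 "$tmp/inputfile" | head -c 512 > "$tmp/check"
cmp -s "$tmp/split3" "$tmp/check" && MATCH=yes || MATCH=no
```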

tar cf - <dir>|split -b<max_size>M - <name>.tar.
2009-11-11 01:53:33
User: dinomite
Functions: split tar
17

Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc.

Rejoin later with `cat <name>.tar.* | tar xf -`
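A round-trip sketch with toy sizes (1 KiB volumes and an invented `backup.tar.` prefix):

```shell
# tar a directory to stdout, split the stream into 1 KiB parts, then
# rejoin the parts and extract, verifying the payload survives intact.
tmp=$(mktemp -d)
mkdir "$tmp/dir"
head -c 3000 /dev/urandom > "$tmp/dir/data.bin"
( cd "$tmp" && tar cf - dir | split -b1k - backup.tar. )
mkdir "$tmp/restore"
( cd "$tmp/restore" && cat ../backup.tar.* | tar xf - )
cmp -s "$tmp/dir/data.bin" "$tmp/restore/dir/data.bin" && OK=yes || OK=no
```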

rar a -m5 -v5M -R myarchive.rar /home/
2009-05-27 15:53:18
User: piovisqui
0

a - add files to archive

-m5 - compression level (0 = lowest compression ... 5 = maximum compression)

-v5M - split the output into 5-megabyte volumes; change to 700 for a CD, or 4200 for a DVD

-R - recurse into directories; do not use it for single files

It's better to have the output of the compression already split than to run the 'split' command after compressing, which would consume double the disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html