
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using split - 14 results
base64 /dev/urandom | head -c 33554432 | split -b 8192 -da 4 - dummy.
9

Avoiding a for loop brought the time down to less than 3 seconds on my old machine. And just to be clear, 33554432 = 8192 * 4096.
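
A quick sanity check of the arithmetic and the output (a sketch; the stat flag assumes GNU coreutils):

ls dummy.* | wc -l     # expect 4096 pieces (33554432 / 8192)
stat -c %s dummy.0000  # expect 8192 bytes per piece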

split -l 12000 -a 5 database.sql splited_file; i=1; for file in splited_file*; do mv $file database_${i}.sql; i=$(( i + 1 )); done
2013-05-15 18:17:47
User: doczine
Functions: file mv split
0

For some reason split will not let you add an extension to the files it creates. Just add this to a .sh script and run it with bash or sh; it will split your text file into 12000-line pieces and then give each file a .sql extension.
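
With a reasonably recent GNU split (assuming coreutils 8.16 or later), the rename loop can be skipped entirely; a sketch:

split -l 12000 -a 5 -d --additional-suffix=.sql database.sql database_
# produces database_00000.sql, database_00001.sql, ...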

tar czf - /directory/to/tar | ccrypt -k yourpassword | split -b50m - /final/encrypted.cpt
route -n | perl -ne '$ANY="0.0.0.0"; /^$ANY/ and split /\s+/ and print "Gateway to the World: ",($_[1]!=$ANY)?$_[1]:(`ip address show $_[$#_]`=~/peer ([0-9\.]+)/ and $1),", via $_[$#_].\n"'
echo $ascii | perl -ne 'printf "%x", ord for split //'
split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo $MSG | mutt -a "$i" -s "$SUBJ" YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split
-1

This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

edit: (changed "g" to "e" for political correctness)
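
Before trusting this with real backups, the whole round trip can be checked (a sketch; it assumes no other file.tgz.* files are lying around):

split -b4m file.tgz file.tgz.
cat file.tgz.* > rejoined.tgz
md5sum file.tgz rejoined.tgz  # the two checksums should match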

split -b 4700000000 file.img.gz file.img.gz.
2010-03-18 15:42:27
Functions: split
1

Real DVD+R size is 4700372992 bytes, but I round down a little to be safe. To reconstitute, use cat: "cat file.img.gz.aa file.img.gz.ab ..... > file.img.gz"
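
To confirm nothing was lost before burning, the rejoined stream can be compared against the original (a sketch):

cat file.img.gz.* | cmp - file.img.gz && echo "parts match the original"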

echo sortmeplease | perl -pe 'chomp; $_ = join "", sort split //'
tar cf - <dir>|split -b<max_size>M - <name>.tar.
2009-11-11 01:53:33
User: dinomite
Functions: split tar
17

Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc.

Rejoin later with `cat <name>.tar.*|tar xf -`
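
A concrete instance with hypothetical names - 700 MB parts of a photos directory:

tar cf - /home/user/photos | split -b700M - photos.tar.
cat photos.tar.* | tar xf -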

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split
3

Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food where you want the total. Use this to monitor and keep a running total on anything that outputs numbers.
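
For example, totalling a hypothetical food journal line:

echo "coffee 5, toast 120, eggs 155.5" | perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
# prints 280.5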

tar czv Pictures | split -d -a 3 -b 16M - pics.tar.gz.
2009-06-09 19:48:01
User: asmoore82
Functions: split tar
11

Leave it to a proprietary software vendor to turn a cheap and easy parlor trick into a selling point. "Hey guys, why don't we turn our _collection of multiple files_ into a *collection of multiple files*!!" Extract the above with this:

cat pics.tar.gz.??? | tar xzv

This extracts on any Unix - no need to install junkware!

(If you must make proprietary software, at least make it do something *new*)

if [ -e windows ]; then use 7-Zip

pr -l 40 bitree.c > printcode; split -40 printcode -d page_
split -b 19m file Nameforpart
2009-02-25 15:24:06
User: vranx
Functions: file split
10

Splits the file into 19 MB parts; put the parts back together again via

cat Nameforpartaa Nameforpartab Nameforpartac >> File
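
Since the generated suffixes (aa, ab, ac, ...) sort alphabetically, a glob does the same job however many parts there are:

cat Nameforpart* > File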

split -b 1k file ; cat x* > file
2009-02-08 23:10:18
User: abcde
Functions: cat file split
2

`split -b 1k file` splits files into 1k chunks. Rejoin them with `cat x* > file`.