Commands tagged split (15)

  • Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc. Rejoin later with `cat <name>.tar.* | tar xf -`


    17
    tar cf - <dir> | split -b <max_size>M - <name>.tar.
    dinomite · 2009-11-11 01:53:33 0
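    A minimal round-trip sketch of the above, assuming a directory named mydir and 100M parts (the names are illustrative, not from the original post):

        tar cf - mydir | split -b 100M - mydir.tar.    # writes mydir.tar.aa, mydir.tar.ab, ...
        cat mydir.tar.* | tar xf -                     # rejoin and extract
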
  • Avoiding a for loop brought this time down to less than 3 seconds on my old machine. And just to be clear, 33554432 = 8192 * 4096.


    10
    base64 /dev/urandom | head -c 33554432 | split -b 8192 -da 4 - dummy.
    pdxdoughnut · 2013-11-12 17:56:23 1
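    If the arithmetic holds, the pipeline leaves exactly 4096 files of 8192 bytes each, named dummy.0000 through dummy.4095; a quick sanity check (assuming GNU stat):

        ls dummy.* | wc -l         # expect 4096
        stat -c %s dummy.0000      # expect 8192
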
  • -o acts like :split. Use -O (capital O) for side-by-side like :vsplit. Use vim -d or vimdiff if you need a diff(1) comparison. To split GNU Screen instead of vim, use ^A S for horizontal, ^A | for vertical.


    7
    vim -o file1 file2...
    rkulla · 2010-04-13 22:09:47 0
  • It's common to want to split up large files, and the usual method is to use split(1). But if you have a 10GiB file, you'll need 10GiB of free space, and the OS has to read 10GiB and write 10GiB (usually on the same filesystem), which takes ages. This command instead uses a set of loop block devices to create fake chunks without making any changes to the file, so the splitting is nearly instantaneous. The example creates a 1GiB file, then splits it into 16 x 64MiB chunks (/dev/loop0 .. /dev/loop15). Note: this isn't a drop-in replacement for split, because the results are block devices; tar and zip won't do what you expect when given block devices. These commands will work: hexdump /dev/loop4; gzip -9 < /dev/loop6 > part6.gz; cat /dev/loop10 > /media/usb/part10.bin


    5
    FILE=file_name; CHUNK=$((64*1024*1024)); SIZE=$(stat -c "%s" $FILE); for ((i=0; i < $SIZE; i+=$CHUNK)); do losetup --find --show --offset=$i --sizelimit=$CHUNK $FILE; done
    flatcap · 2014-10-03 13:18:19 2
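    When you're done with the chunks, the loop devices should be detached again; a sketch, assuming the sixteen devices from the example were allocated in order (util-linux's losetup -D detaches all loop devices in one go):

        for dev in /dev/loop{0..15}; do losetup -d "$dev"; done
        # or simply:
        losetup -D
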
  • bs = block size (it defines the size of a "unit" used by count and skip). count = the number of blocks to copy (16m * 32 = 1/2 gig). skip = 32 * 2: we are grabbing piece 3, which means 2 pieces have already been written, so skip 2 * count blocks.


    2
    dd if=inputfile of=split3 bs=16m count=32 skip=64
    jearsh · 2010-02-21 10:09:46 0
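    The same arithmetic generalizes to a loop that extracts every piece; a sketch, assuming GNU dd (which spells the block size 16M) and a file that fits in three 512MB pieces:

        for i in 0 1 2; do
          dd if=inputfile of=split$((i+1)) bs=16M count=32 skip=$((i*32))
        done
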
  • I have a large video file, 500+ MB, so I can't upload it to flickr; to reduce the size I split it into 2 files. The command shows the splitting for the first file, from 0 to 4 minutes. -ss is the start time and -t is the duration (how long you want the output file to be). Credit goes to philc: http://ubuntuforums.org/showthread.php?t=480343 NOTE: when I made the second half of the video, I got a *lot* of lines like this: frame= 0 fps= 0 q=0.0 size= 0kB time=10000000000.00 bitrate= 0.0kbit. Just be patient, it is working =)


    2
    ffmpeg -i 100_0029.MOV -ss 00:00:00 -t 00:04:00 100_0029_1.MOV
    nickleus · 2010-08-08 23:43:28 0
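    The matching command for the second half would start where the first leaves off; a sketch (omitting -t makes ffmpeg read to the end of the input):

        ffmpeg -i 100_0029.MOV -ss 00:04:00 100_0029_2.MOV
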
  • Use vim's diff mode to edit two or more files in one window. The '+diffoff!' turns off diff highlighting when the session is started. Use ctrl+w ctrl+w to switch between windows.


    2
    vim -d '+diffoff!' file1 file2
    greggster · 2012-08-30 07:51:41 0
  • Is this not the same?


    2
    vim -O file1 file2
    trantorvega · 2012-09-06 14:52:50 0
  • a = archive. -m5 = compression level (0 = lowest compression ... 5 = max compression). -v5M = split the output into 5-megabyte volumes; change to 700 for a CD, or 4200 for a DVD. -R = recurse into directories (do not use it for single files). It's better to have the compressor's output already split than to run the 'split' command after compression, which would consume double the disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html


    0
    rar a -m5 -v5M -R myarchive.rar /home/
    piovisqui · 2009-05-27 15:53:18 4
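    To reassemble, point unrar at the first volume and it pulls in the rest automatically; a sketch, assuming a recent rar whose default volume naming is .partN.rar:

        unrar x myarchive.part1.rar
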
  • Splits the stdin lines into files grouped by the content of the 5th (tab-separated) column.


    0
    awk -F'\t' '{print $0 >> ($5 ".tsv")}'
    pykler · 2012-05-16 18:18:16 0
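    A quick demonstration with made-up input: the first row lands in a.tsv, the second in b.tsv (note that >> appends, so rerunning grows the files):

        printf '1\t2\t3\t4\ta\n5\t6\t7\t8\tb\n' | awk -F'\t' '{print $0 >> ($5 ".tsv")}'
        cat a.tsv    # prints the first row
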
  • Output should be two JPG files named like "output-1.jpg" and "output-2.jpg". The convert command is part of ImageMagick so you'll need that and dependent packages installed to use it.


    0
    convert yourdoublewideimage.jpg -crop 50%x100% +repage output.jpg
    pagesix1536 · 2014-01-15 15:34:41 0
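    The same -crop geometry generalizes; for instance, a hypothetical four-way split into quadrants (depending on the ImageMagick version, the outputs are numbered quadrant-0.jpg onwards):

        convert yourdoublewideimage.jpg -crop 50%x50% +repage quadrant.jpg
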
  • IMPORTANT: You need Windows PowerShell to run this command; in your Windows Command Prompt, type powershell. Uses sajb to start a PowerShell background job that pings an IP host every 10 seconds. Any change in the host's Up/Down state is time-stamped and logged to a file, with date/time stamps in two formats: Unix and human-readable. A while(1) loop repeats the test every 10 seconds by using the sleep command. I use this command to log Up/Down events of my Motorola SB6141 cable modem (192.168.100.1). To end the logging, close the PowerShell window or use the "exit" command.


    0
    sajb {$ip="192.168.100.1";$old=0;while(1){$up=test-connection -quiet -count 1 $ip;if($up -ne $old){$s=(date -u %s).split('.')[0]+' '+(date -f s).replace('T',' ')+' '+$ip+' '+$(if($up){'Up'}else{'Down'});echo $s|out-file -a $home\ping.txt;$old=$up};sleep 10}}
    omap7777 · 2015-12-28 20:33:08 0
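    For comparison, a rough bash sketch of the same idea, not the author's method (the ping flags and log path are assumptions; -W is the Linux per-ping timeout):

        ip=192.168.100.1; old=""
        while true; do
          if ping -c 1 -W 2 "$ip" >/dev/null 2>&1; then up=Up; else up=Down; fi
          if [ "$up" != "$old" ]; then
            echo "$(date +%s) $(date '+%F %T') $ip $up" >> ~/ping.txt
            old=$up
          fi
          sleep 10
        done
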
  • This is just a little snippet to split a large file into smaller chunks (4MB in this example) and then send the chunks off by (e)mail for archival using mutt. I usually encrypt the file before splitting it using openssl: openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3 To restore, simply save the attachments and rejoin them using: cat file.tgz.* > output_name.tgz and, if encrypted, decrypt using: openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz edit: (changed "g" to "e" for political correctness)


    -1
    split -b4m file.tgz file.tgz.; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -s "$SUBJ" -a "$i" -- [email protected](E)mail.com; done
    tboulay · 2010-03-20 16:49:19 4
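    After rejoining, it's worth confirming the reconstruction byte-for-byte; a minimal check, assuming you still have the original to compare against:

        cmp file.tgz output_name.tgz && echo OK
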
  • Open files in tabs


    -2
    vim -p file1 file2 [...]
    richard · 2010-04-15 11:40:19 0
  • Loops over an array stored in environment variables, splits each value, and puts the pieces into %A, %B, %C, %D, and so on. Create the array first, e.g. set ARRAY[0]=test1,100 and set ARRAY[1]=test2,200. Be sure to replace %A, %B, etc. with %%A, %%B, etc. when using this from inside batch files.


    -2
    FOR /F "tokens=3* delims=[]=," %A IN ('SET ARRAY[') DO ( echo %A -- %B )
    Marco · 2010-08-10 12:12:27 2
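    A rough bash equivalent of the same split-and-iterate pattern, purely illustrative:

        ARRAY=("test1,100" "test2,200")
        for entry in "${ARRAY[@]}"; do
          IFS=, read -r a b <<< "$entry"
          echo "$a -- $b"
        done
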
