cat video.avi.001 video.avi.002 video.avi.003 >> video.avi

Combine video file parts that were downloaded separately, using the cat command (a split/join sketch follows below).


0
By: shroff
2011-07-14 09:55:22
cat

These Might Interest You

  • Play your newly downloaded videos with goyoutube, or in random order with goyoutube rand. This command assumes you've already downloaded some YouTube .mp4 or .flv video files via other means. Requires 'shuf', or your own stdin shuffler.


    -1
    goyoutube() { d=/path/to/videos; p=$d/playlist; m=$d/*.mp4; f=$d/*.flv; if [ "$1" == 'rand' ]; then ls -1 $m $f | shuf >$p; else ls -1t $m $f >$p; fi; mplayer -geometry 500x400 -playlist $p; }
    meathive · 2010-04-11 18:53:49 0
  • Starts playing a video while it is still being downloaded; playback does not stop at the point the download had reached when the video started. (A usage sketch follows after this list.)


    0
    mplayer <(tail -fc +0 <filename>)
    gtmanfred · 2012-04-30 07:28:42 0
  • Rips the audio and video streams out of a movie; the two streams are stored as separate files. (A sketch for muxing them back together follows after this list.)


    5
    ffmpeg -i source_movie.flv -vcodec mpeg2video target_video.m2v -acodec copy target_audio.mp3
    dcabanis · 2009-05-23 23:52:51 1
  • Download a Google Video with wget, or pass the video URL to e.g. mplayer to view it as a stream.
    1. VURL: replace with the video URL, e.g. http://video.google.com/videoplay?docid=12312312312312313#
    2. OUPUT_FILE: optionally change to a better-suited name; this is the downloaded file, e.g. foo.flv
    Improvements greatly appreciated (this is close to my first Linux command after ls -A :) ).
    Breakdown, pipe by pipe:
    1. wget: fetch the HTML from Google, pass it to stdout
    2. grep: grab the video URL up to thumbnailUrl (not needed)
    3. grep: strip off everything before http://
    4. sed: urldecode
    5. echo: expand the hex escapes
    6. sed: strip the trailing part before thumbnailUrl
    7. wget: download. Here one could instead use e.g. mplayer or another player.
    (An annotated, stage-by-stage version appears after this list.)


    2
    wget -qO- "VURL" | grep -o "googleplayer.swf?videoUrl\\\x3d\(.\+\)\\\x26thumbnailUrl\\\x3dhttp" | grep -o "http.\+" | sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e | sed 's/.\{22\}$//g' | xargs wget -O OUPUT_FILE
    unixmonkey14750 · 2010-12-03 17:27:08 0
  • If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of starting again from the beginning. This is a real win when downloading Debian ISO images over a buggy DSL modem. Take the partially downloaded file and cat it into the STDIN of curl, as shown; then use the "-C -" option followed by the URL of the file you were originally downloading. (A simpler resume form is sketched after this list.)


    -1
    cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
    linuxrawkstar · 2009-08-05 13:33:06 3
  • Produces an MPEG-4/DivX video file ready for uploading to YouTube, from an FLV file downloaded from the site plus your own UTF-8-encoded subtitle file. No resizing needed.


    1
    mencoder -sub subs.ssa -utf8 -subfont-text-scale 4 -oac mp3lame -lameopts cbr=128 -ovc lavc -lavcopts vcodec=mpeg4 -ffourcc xvid -o output.avi input.flv
    ivalladt · 2009-09-12 09:24:24 1
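
A concrete usage of the mplayer/tail trick above, with hypothetical file and URL names: start the download in one shell and watch the growing file from another. GNU tail's '-c +1 -f' reads from the first byte and keeps following, which matches the intent of the original '-fc +0' spelling.

    # shell 1: start the download; the file grows as data arrives
    wget -O movie.part http://example.com/movie.mp4

    # shell 2: play from the beginning while the download continues
    mplayer <(tail -c +1 -f movie.part)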
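
To complement the ffmpeg stream-splitting one-liner above, a sketch of the reverse step: muxing the two separate files back into one container without re-encoding. Filenames follow that example; the .mpg output name is an assumption.

    # copy both streams as-is into a single output file
    ffmpeg -i target_video.m2v -i target_audio.mp3 -vcodec copy -acodec copy combined.mpg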
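
The Google Video downloader above packs seven steps into one line; here is the same pipeline split one stage per line, with a comment per stage. Google Video itself is long gone, so treat this purely as an illustration of the grep/sed/'echo -e' url-decoding technique.

    wget -qO- "VURL" |                                                                    # 1. fetch the page HTML to stdout
    grep -o "googleplayer.swf?videoUrl\\\x3d\(.\+\)\\\x26thumbnailUrl\\\x3dhttp" |        # 2. cut out the encoded video URL
    grep -o "http.\+" |                                                                   # 3. drop everything before http
    sed -e 's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' |                                         # 4. rewrite %XX escapes as \xXX
    xargs echo -e |                                                                       # 5. expand the \xXX escapes (urldecode)
    sed 's/.\{22\}$//g' |                                                                 # 6. strip the trailing thumbnailUrl residue
    xargs wget -O OUPUT_FILE                                                              # 7. download the decoded URL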
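
For the curl resume entry above, a simpler form that skips the cat pipe: with '-o', curl's '-C -' works out the resume offset from the partial file already on disk, and wget's '-c' does the same job. Filenames and URL are the same placeholders as in the original.

    # resume straight into the partial file; curl infers the offset from its size
    curl -C - -o file-that-failed-to-download.zip http://www.somewhere.com/file-I-want-to-download.zip

    # the wget equivalent
    wget -c http://www.somewhere.com/file-I-want-to-download.zip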

What Others Think

I personally find it hilarious that this works.
h3xx · 353 weeks ago
