Download YouTube video with wget!

wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
Nothing special required, just wget, sed & tr!
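The one-liner is dense, so here is the same pipeline split out stage by stage with comments. YouTube's page markup has long since changed, so treat this as a reading of what each stage did, not as a command that still works today:

# fetch the watch page quietly and write the HTML to stdout
wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- |
# on the line containing fmt_url_map, turn each of \ ' " | into a newline,
# splitting the URL map into one token per line, then print it
sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" |
# keep only the tokens from fmt_url_map down to the first videoplayback URL
sed -n '/^fmt_url_map/,/videoplayback/p' |
# a tail-like sed idiom: keep just the last four lines of that range
sed -e :a -e '$q;N;5,$D;ba' |
# glue the surviving lines back into a single line
tr -d '\n' |
# strip the trailing ",NN" format code left after the URL
sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' |
# hand the resulting URL back to wget and save the video
wget -i - -O surprise.flv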
Sample Output
Resolving v3.lscache3.c.youtube.com... 74.125.170.46
Connecting to v3.lscache3.c.youtube.com|74.125.170.46|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 25120289 (24M) [video/x-flv]
Saving to: `surprise.flv'

By: Eno
2011-01-25 04:19:06

What Others Think

Oops! It's not working for me. I'm getting this error:
sed: 1: "/fmt_url_map/{s/[\'"\|] ...": extra characters at the end of p command
No URLs found in -.
kaartz · 393 weeks and 6 days ago
I'm having the same problem as kaartz... I'm running OS X with a wget port.
TCorbin · 393 weeks and 1 day ago
Weird, it's working fine on GNU/Linux; I'll check that tomorrow on OS X ;)
Eno · 393 weeks ago
The command works fine for me on Cygwin. For OS X, the sed implementation does not support replacing with "\n", hence the error. I tinkered with the command a bit and made it work on OS X using awk instead: wget "http://www.youtube.com/watch?v=dQw4w9WgXcQ" -qO- | awk '/fmt_url_map/{gsub(/[\|\"]/,"\n");print}' | sed -n "/^fmt_url_map/,/videoplayback/p" | sed -e :a -e '$q;N;2,$D;ba' | tr -d '\n' | sed -e "s/\(.*\),\(.\)\{1,3\}/\1/;s/\\\//g" | wget -i - -O surprise.flv
ixseven · 392 weeks and 3 days ago
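A quick way to see the portability problem ixseven describes: GNU sed treats \n in the replacement text as a newline, while the BSD sed shipped with OS X inserts a literal n. The test strings below are made up just for illustration:

# GNU sed prints "a" and "b" on separate lines; BSD/OS X sed prints "anb"
printf 'a|b\n' | sed 's/|/\n/g'
# portable alternatives that split on | everywhere:
printf 'a|b\n' | tr '|' '\n'
printf 'a|b\n' | awk '{gsub(/\|/, "\n"); print}'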
Hi, with the Firefox plugin DownloadHelper you can get the real URL. Once you have the real link you can do: screen wget -O fileName.flv "http://v21.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Calgorithm%2Cburst%2Cfactor%2Coc%3AU0dYTVZST19FSkNNOF9OTFND&fexp=901316&algorithm=throttle-factor&itag=34&ipbits=0&burst=40&sver=3&signature=72BE5F2A3EDB50B05354CA542AE807B50C9F2CE1.87EA6771ABAB2B13E867C17E7F8894F77209DD68&expire=1298671200&key=yt1&ip=0.0.0.0&factor=1.25&id=5415eea9d2ca1a35"
ilanehazout · 389 weeks and 5 days ago
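The screen prefix in ilanehazout's command just keeps the download running if the terminal closes. A minimal sketch of the detached form, where "ytdl" is an arbitrary session name and $url stands for whatever direct videoplayback link DownloadHelper gives you:

# start a detached screen session that runs the download
screen -dmS ytdl wget -O fileName.flv "$url"
# reattach later to check on progress
screen -r ytdl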
Now it's not working because of the unicode symbol substitution. To get it to work you should add "sed 's/\\u0026/\&/g'" first, so that the resulting command is: wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed 's/\\u0026/\&/g' | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
aikikode · 385 weeks and 6 days ago
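aikikode's extra stage decodes JavaScript-escaped ampersands before the rest of the pipeline sees them: in the page source the query-string separators appear as \u0026 rather than &. In isolation, with a made-up test string:

# rewrite the escaped separators back into plain ampersands
printf 'videoplayback?a=1\\u0026b=2\n' | sed 's/\\u0026/\&/g'
# prints: videoplayback?a=1&b=2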
Hi all, I have tried all the commands to download a YouTube video from a playlist. Here is what works for me: mplayer -dumpstream -dumpfile "$i.flv" $(curl -s " I just made a loop to take all the videos from the playlist, but the only problem is the time it takes, because it really reads the video and puts it in a file :x Hope it will work for you guys.
BlckG33k · 374 weeks and 4 days ago
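BlckG33k's curl command is cut off above, but the loop shape is simple enough to sketch. Assuming a hypothetical urls.txt with one direct stream URL per line (however you collect them), something like:

# dump each stream to a numbered .flv file; as noted above this is slow,
# since mplayer reads each video through before the file is complete
i=0
while read -r url; do
  i=$((i + 1))
  mplayer -dumpstream -dumpfile "$i.flv" "$url"
done < urls.txt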
I tried it on RHEL 6 and got the error. Check this:
[chankey@localhost ~]$ wget http://www.youtube.com/watch?v=Ss_MZMJ28_Y -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
--2011-07-19 10:54:25-- http://o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Cratebypass
Resolving o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com... 74.125.171.172, 2404:6800:4004:3::c
Connecting to o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com|74.125.171.172|:80... connected.
HTTP request sent, awaiting response... 400 Bad Request
2011-07-19 10:54:25 ERROR 400: Bad Request.
ChankeyPathak · 369 weeks and 1 day ago
cool
ZhaoZijie · 301 weeks and 1 day ago
Not working: No URLs found in -. :'( I've been looking for this for a year :'(
kaldoran · 256 weeks and 1 day ago

