
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/



Download Youtube video with wget!

wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
2011-01-25 04:19:06
User: Eno
Functions: sed tr wget
Votes: 37

Nothing special required, just wget, sed & tr!
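For anyone puzzling over the pipeline, its least obvious stage can be tried in isolation. This is a minimal sketch on synthetic input (dummy lines, not a real YouTube page), showing what the `sed -e :a -e '$q;N;5,$D;ba'` loop and the trailing `tr` actually do:

```shell
# The sliding-window sed idiom: :a labels a loop, N appends the next input
# line to the pattern space, and from line 5 onward D drops the oldest
# buffered line each cycle, so only the last 4 lines of input survive.
# $q prints the surviving window and quits at end of input.
# tr -d '\n' then glues the survivors into a single line.
printf '%s\n' one two three four five six seven \
  | sed -e :a -e '$q;N;5,$D;ba' \
  | tr -d '\n'
# prints: fourfivesixseven
```

In the real command the surviving tail is the chunk of the page containing the fmt_url_map URLs, which the final sed trims before handing to the second wget.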


What others think

oops! it's not working for me. I'm getting this error:

sed: 1: "/fmt_url_map/{s/[\'"\|] ...": extra characters at the end of p command

No URLs found in -.

Comment by kaartz 187 weeks and 2 days ago

I'm having the same problem as kaartz... I'm running OS X, with wget ported to OS X.

Comment by TCorbin 186 weeks and 4 days ago

Weird, it works fine on GNU/Linux; I'll check it on OS X tomorrow ;)

Comment by Eno 186 weeks and 3 days ago

The command works fine for me on Cygwin. On OS X, the sed implementation does not support replacing with "\n", hence the error. I tinkered with the command a bit and made it work on OS X using awk instead:

wget "http://www.youtube.com/watch?v=dQw4w9WgXcQ" -qO- | awk '/fmt_url_map/{gsub(/[\|\"]/,"\n");print}' | sed -n "/^fmt_url_map/,/videoplayback/p" | sed -e :a -e '$q;N;2,$D;ba' | tr -d '\n' | sed -e "s/\(.*\),\(.\)\{1,3\}/\1/;s/\\\//g" | wget -i - -O surprise.flv
Comment by ixseven 185 weeks and 6 days ago
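The portable piece of ixseven's fix can be checked on its own: awk's gsub inserts literal newlines, which the BSD sed shipped with OS X cannot produce in a s/// replacement. A small sketch with a made-up sample line (not real YouTube output):

```shell
# gsub replaces every pipe or double quote with a real newline, so each
# |-separated field of the matched line ends up on its own line; the rest
# of the original pipeline can then slice the fields with line-oriented sed.
printf '%s\n' 'fmt_url_map=22|http://a|37|http://b' \
  | awk '/fmt_url_map/{gsub(/[|"]/,"\n");print}'
# prints:
# fmt_url_map=22
# http://a
# 37
# http://b
```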

Hi. With the Firefox plugin DownloadHelper you can get the real URL. Once you have the real link you can do:

screen wget -O fileName.flv "http://v21.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Calgorithm%2Cburst%2Cfactor%2Coc%3AU0dYTVZST19FSkNNOF9OTFND&fexp=901316&algorithm=throttle-factor&itag=34&ipbits=0&burst=40&sver=3&signature=72BE5F2A3EDB50B05354CA542AE807B50C9F2CE1.87EA6771ABAB2B13E867C17E7F8894F77209DD68&expire=1298671200&key=yt1&ip=0.0.0.0&factor=1.25&id=5415eea9d2ca1a35"
Comment by ilanehazout 183 weeks and 1 day ago

This stopped working when YouTube switched to unicode escape sequences. To make it work, add "sed 's/\\u0026/\&/g'" as a first stage, so the resulting command is:

wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed 's/\\u0026/\&/g' | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
Comment by aikikode 179 weeks and 2 days ago
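aikikode's extra stage can also be tested in isolation: the page source started escaping "&" as the JSON sequence "\u0026", so the query-string separators must be restored before the extracted URLs are usable. A sketch with a made-up query string:

```shell
# The pattern \\u0026 matches a literal backslash followed by u0026;
# the replacement \& is a literal ampersand (a bare & would mean
# "the whole match" in sed), restoring the query-string separators.
printf '%s\n' 'itag=34\u0026ipbits=0\u0026sver=3' | sed 's/\\u0026/\&/g'
# prints: itag=34&ipbits=0&sver=3
```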

Hi all,

I have tried all the commands here to download a YouTube video from a playlist.

Here is what works for me:

mplayer -dumpstream -dumpfile "$i.flv" $(curl -s "

I just made a loop to grab every video from the playlist. The only problem is how long it takes, because mplayer actually plays through the stream while writing it to a file :x

Hope it will work for you guys

Comment by BlckG33k 168 weeks ago

I tried it on RHEL 6 and got an error.

Check this:

[chankey@localhost ~]$ wget http://www.youtube.com/watch?v=Ss_MZMJ28_Y -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv

--2011-07-19 10:54:25-- http://o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Cratebypass

Resolving o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com... 74.125.171.172, 2404:6800:4004:3::c

Connecting to o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com|74.125.171.172|:80... connected.

HTTP request sent, awaiting response... 400 Bad Request

2011-07-19 10:54:25 ERROR 400: Bad Request.

Comment by ChankeyPathak 162 weeks and 4 days ago

cool

Comment by ZhaoZijie 94 weeks and 4 days ago

Not working:

No URLs found in -.

:'(

I've been looking for this for a year :'(

Comment by kaldoran 49 weeks and 4 days ago
