What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and 10 votes respectively, so only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Download Youtube video with wget!

wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
2011-01-25 04:19:06
User: Eno
Functions: sed tr wget

Nothing special required, just wget, sed & tr!
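The two least obvious stages of the pipeline are the sed idiom that keeps only the last 4 lines and the final substitution that strips the trailing ",NN" format code from the chosen URL. Both can be tried offline on made-up sample data (the 2011-era YouTube page format the full pipeline relied on is long gone, so this is just a sketch of how those stages behave):

```shell
# ':a;$q;N;5,$D;ba' accumulates lines in the pattern space and, from line 5
# onwards, deletes the oldest one, so only the last 4 input lines survive.
printf 'a\nb\nc\nd\ne\nf\n' | sed -e :a -e '$q;N;5,$D;ba'
# prints: c d e f (one per line)

# 's/\(.*\),\(.\)\{1,3\}/\1/' greedily matches everything up to the last
# comma and drops that comma plus the 1-3 characters after it (the format code).
printf '%s\n' 'http://example.com/videoplayback?sig=abc,35' \
  | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/'
# prints: http://example.com/videoplayback?sig=abc
```

The example URL and format code above are invented for illustration; only the sed expressions come from the original command.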


There is 1 alternative - vote for the best!

What others think

oops! it's not working for me. I'm getting this error:

sed: 1: "/fmt_url_map/{s/[\'"\|] ...": extra characters at the end of p command

No URLs found in -.

Comment by kaartz 220 weeks and 3 days ago

I'm having the same problem as kaartz... I'm running OS X, using the OS X port of wget.

Comment by TCorbin 219 weeks and 4 days ago

Weird, it's working fine on GNU/Linux. I'll check that tomorrow on OS X ;)

Comment by Eno 219 weeks and 4 days ago

The command works fine for me on Cygwin. On OS X, the sed implementation does not support replacing with "\n", hence the error. I tinkered with the command a bit and made it work on OS X using awk instead.

wget "http://www.youtube.com/watch?v=dQw4w9WgXcQ" -qO- | awk '/fmt_url_map/{gsub(/[\|\"]/,"\n");print}' | sed -n "/^fmt_url_map/,/videoplayback/p" | sed -e :a -e '$q;N;2,$D;ba' | tr -d '\n' | sed -e "s/\(.*\),\(.\)\{1,3\}/\1/;s/\\\//g" | wget -i - -O surprise.flv
Comment by ixseven 219 weeks ago
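The portability issue ixseven describes can be seen with a tiny demo (sample input made up): GNU sed expands \n in the replacement text to a newline, while BSD/macOS sed inserts a literal "n", which is why the original command breaks there; awk's gsub() interprets "\n" the same way on both platforms.

```shell
# On GNU sed this splits the line in two; on BSD/macOS sed the replacement
# becomes a literal 'n' instead of a newline.
printf 'a|b\n' | sed 's/|/\n/'

# awk's gsub() handles "\n" portably, which is why the awk rewrite works on OS X.
printf 'a|b\n' | awk '{gsub(/\|/, "\n"); print}'
```

On a GNU/Linux box both commands print "a" and "b" on separate lines; only the first differs on macOS.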

Hi, with the Firefox plugin DownloadHelper you can get the real URL. Once you have the real link you can do:

screen wget -O fileName.flv "http://v21.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Calgorithm%2Cburst%2Cfactor%2Coc%3AU0dYTVZST19FSkNNOF9OTFND&fexp=901316&algorithm=throttle-factor&itag=34&ipbits=0&burst=40&sver=3&signature=72BE5F2A3EDB50B05354CA542AE807B50C9F2CE1.87EA6771ABAB2B13E867C17E7F8894F77209DD68&expire=1298671200&key=yt1&ip="
Comment by ilanehazout 216 weeks and 1 day ago

Now it's not working because of Unicode escape sequences in the URLs. To get it to work, you should add "sed 's/\\u0026/\&/g'" first, so that the resulting command is:

wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed 's/\\u0026/\&/g' | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
Comment by aikikode 212 weeks and 2 days ago
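The extra stage aikikode adds decodes the JavaScript escape \u0026 back into a literal ampersand, so the videoplayback URL keeps its query-string parameters. It can be checked on a sample string (the URL fragment below is invented for illustration):

```shell
# \u0026 is the JavaScript/JSON escape for '&'; the pattern \\u0026 matches the
# literal backslash sequence, and \& in the replacement is a literal ampersand.
printf '%s\n' 'videoplayback?a=1\u0026b=2' | sed 's/\\u0026/\&/g'
# prints: videoplayback?a=1&b=2
```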

Hi all

I have tried all the commands to download a YouTube video from a playlist.

Here is what works for me:

mplayer -dumpstream -dumpfile "$i.flv" $(curl -s "

I just made a loop to grab all the videos from a playlist, but the only problem is the time it takes, because it really plays each video and dumps it to a file :x

Hope it will work for you guys

Comment by BlckG33k 201 weeks and 1 day ago

I tried it in RHEL 6 and got the error.

Check this.

[chankey@localhost ~]$ wget http://www.youtube.com/watch?v=Ss_MZMJ28_Y -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv

--2011-07-19 10:54:25-- http://o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Cratebypass

Resolving o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com..., 2404:6800:4004:3::c

Connecting to o-o.preferred.nrt04s02.v12.lscache7.c.youtube.com||:80... connected.

HTTP request sent, awaiting response... 400 Bad Request

2011-07-19 10:54:25 ERROR 400: Bad Request.

Comment by ChankeyPathak 195 weeks and 5 days ago



Not working:

No URLs found in -.


I've been looking for this for a year :'(

Comment by kaldoran 82 weeks and 4 days ago
