What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Download Entire YouTube Channel - all of a user's videos

yt-chanrip() { for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do youtube-dl --title --no-overwrites http://youtube.com/"$i"; done }
2011-01-29 05:52:25
User: m1cawber
Functions: grep
Votes: 4

Create the function, then run 'yt-chanrip username' to download that user's entire channel.

Uses youtube-dl and the GData API. Similar to http://www.commandlinefu.com/commands/view/3154/download-youtube-playlist
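
The same one-liner, unrolled with comments for readability (a sketch of the identical logic, not a replacement):

yt-chanrip() {
  # Fetch the user's uploads feed from the GData API, pull out the
  # 11-character video IDs, and drop adjacent duplicates.
  for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do
    # Download each video, naming the file after its title and
    # skipping files that already exist.
    youtube-dl --title --no-overwrites http://youtube.com/"$i"
  done
}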


What others think

I'm new to all this and a little lost but would like to try this command. The user's name is GameShowNetwork so how would I run this command for that user? Thanks for this. :D

Comment by primetime34 190 weeks and 2 days ago

doh! this only does 25 at a time so should also be wrapped in another for loop, making it:

yt2p() { for count in 1 51 101 151 201; do for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads\?start-index="$count"\&max-results=50 | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do youtube-dl --title --no-overwrites http://youtube.com/"$i"; done; done }

just make sure the highest $count in the for loop is larger than the total number of vids the user has uploaded.

the grep regex could be improved to exclude false positives e.g. if the user wrote 'watch?v=SomeVideoID' in any of their video descriptions. meh, works ok for now
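
For reference, a stricter extraction could look like this (a sketch; yt-ids is a hypothetical helper name, and it relies on the fact that YouTube video IDs only use letters, digits, '-' and '_'):

yt-ids() {
  # A positive character class for the 11-character ID avoids most
  # false positives; sort -u is used because uniq alone only removes
  # *adjacent* duplicates.
  curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads | grep -Eo "watch\?v=[A-Za-z0-9_-]{11}" | sort -u
}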

Comment by m1cawber 190 weeks and 2 days ago

primetime34 try:

yt-chanrip() { for count in 1 51 101 151 201 251 301; do for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads\?start-index="$count"\&max-results=50 | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do youtube-dl --title --no-overwrites http://youtube.com/"$i"; done; done }

then

yt-chanrip GameShowNetwork

sorry e1 for my screwups!

Comment by m1cawber 190 weeks and 2 days ago

youtube-dl doesn't always work.

I would use command #7718 instead to download the video.

Comment by RanyAlbeg 190 weeks and 2 days ago

I get the following error for every video

ERROR: unable to download video (format may not be available)

Thoughts?

Comment by primetime34 190 weeks and 1 day ago

primetime34, I'm not sure - the first three vids started downloading for me on fedora14 with youtube-dl version 2010.12.09. Try youtube-dl -U to update it. Otherwise you could dump all of the vid URLs to a file and try downloading them with 'youtube-dl --batch-file=FILE', or parse through them using a for loop and another download command such as #7718 as suggested above. To dump the URLs to a file run:

for count in 1 51 101 151 201 251 301; do for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"GameShowNetwork"/uploads\?start-index="$count"\&max-results=50 | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do echo "http://youtube.com/$i" >> FILE; done; done
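
then feed that file back to youtube-dl via the --batch-file flag mentioned above (keeping the original function's flags):

youtube-dl --title --no-overwrites --batch-file=FILE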

good luck

Comment by m1cawber 190 weeks and 1 day ago

I've made the script a little prettier:

yt-chanrip() { for count in `seq 0 50 $(curl -s http://www.youtube.com/user/$1 | grep -E "id=\"playnav-playlist-uploads-count\" value=\"[[:digit:]]+" | grep -Eo "[[:digit:]]+")`; do for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads\?start-index="$count"\&max-results=50 | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do youtube-dl -citw http://youtube.com/"$i"; done; done }

now it auto-fetches the total number of user videos.

Comment by unjello 150 weeks and 2 days ago

can't make it work... trying it on cygwin... do I need something besides curl, youtube-dl script, python? it does nothing....

Comment by bidomo 131 weeks and 3 days ago

unjello: your video count fetching no longer works, so I updated it to use the API:

yt-chanrip() { for count in `seq 1 50 $(curl -s http://gdata.youtube.com/feeds/api/users/"$1" | grep -Eo "uploads' countHint='[[:digit:]]+'" | grep -Eo "[[:digit:]]+")`; do for i in $(curl -s http://gdata.youtube.com/feeds/api/users/"$1"/uploads\?start-index="$count"\&max-results=50 | grep -Eo "watch\?v=[^[:space:]\"\'\\]{11}" | uniq); do youtube-dl -citw http://youtube.com/"$i"; done; done }

Comment by Kapow 129 weeks and 4 days ago

now that I look at it, youtube-dl could do this on its own the whole time...

youtube-dl -citw ytuser:[USERNAME]
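
For the channel asked about earlier, for example:

youtube-dl -citw ytuser:GameShowNetwork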

Comment by Kapow 129 weeks and 4 days ago

clive could use this feature. It's what I used to dl the ones I needed the other day. I may make a post later.

Comment by bbelt16ag 64 weeks and 1 day ago
