Commands tagged youtube-dl (13)

  • Same as the other pyt command, but uses youtube-dl's internal search (thanks to qoxxxx for mentioning this). It does, however, seem to be a little buggy, and youtube-dl crashes sometimes. Usage: pyt 'Stairway to heaven - Led Zeppelin' or pyt 'brain damage - Pink Floyd'. No web browser or even X is needed, just a CLI and an internet connection. mplayer is pausable and can skip ahead. This may break if YouTube changes their search HTML. A commented layout of the function follows after this entry.


    6
    pyt() { youtube-dl -q -f bestaudio --max-downloads 1 --no-playlist --default-search ${2:-ytsearch} "$1" -o - | mplayer -vo null /dev/fd/3 3<&0 </dev/tty; }
    snipertyler · 2015-07-27 15:19:59 3
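
    A commented, multi-line sketch of the same function, assuming bash (logic and flags unchanged; quoting added around the defaulted search prefix):

        pyt() {
            # $1 is the search string (or URL); $2 optionally overrides the
            # search prefix, defaulting to ytsearch. The best audio stream of
            # the first hit is written to stdout (-o -) and piped to mplayer.
            # mplayer reads the stream on fd 3 (3<&0 duplicates the pipe there),
            # while stdin is reattached to the terminal so pause/seek keys work.
            youtube-dl -q -f bestaudio --max-downloads 1 --no-playlist \
                --default-search "${2:-ytsearch}" "$1" -o - \
                | mplayer -vo null /dev/fd/3 3<&0 </dev/tty
        }
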
  • Streams youtube-dl video to mplayer. Usage: syt 'youtube.com/link' 'anotherlinkto.video'. Uses the usual mplayer controls. A commented breakdown follows after this entry.


    5
    syt() { pipe=`mktemp -u`; mkfifo -m 600 "$pipe" && for i in "$@"; do youtube-dl -qo "$pipe" "$i" & mplayer "$pipe" || break; done; rm -f "$pipe"; }
    snipertyler · 2015-03-14 01:48:20 4
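
    The same idea spelled out with comments, assuming bash (mktemp/mkfifo behaviour as in the original):

        syt() {
            pipe=$(mktemp -u)       # reserve a unique temp path (no file is created yet)
            # Create a private FIFO at that path, then stream each URL through it.
            mkfifo -m 600 "$pipe" && for i in "$@"; do
                youtube-dl -qo "$pipe" "$i" &   # writer: fills the FIFO in the background
                mplayer "$pipe" || break        # reader: plays it; stop if playback fails
            done
            rm -f "$pipe"           # remove the FIFO afterwards
        }
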
  • Usage: pyt 'Stairway to heaven - Led Zeppelin' or pyt 'brain damage - Pink Floyd'. No web browser or even X is needed, just a CLI and an internet connection. mplayer is pausable and can skip ahead. This may break if YouTube changes their search HTML. A commented breakdown follows after this entry.


    4
    pyt() { id=$(curl -s 'https://www.youtube.com/results?search_query='$(tr ' ' + <<<"$1") | grep -om3 '"[[:alnum:]]\{11\}"' | awk NR==3 | tr -d \"); youtube-dl -q 'https://www.youtube.com/watch?v='"$id" -o - | mplayer -vo null /dev/fd/3 3<&0 </dev/tty; }
    snipertyler · 2015-07-20 05:30:27 6
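
    Broken out with comments, assuming bash (same logic; this screen-scrapes the 2015-era results page, so the id extraction is fragile by design):

        pyt() {
            # Fetch the search-results page (spaces become + in the query) and
            # keep the third quoted 11-character token, which evidently was the
            # first real video id on the page at the time.
            id=$(curl -s 'https://www.youtube.com/results?search_query='$(tr ' ' + <<<"$1") \
                 | grep -om3 '"[[:alnum:]]\{11\}"' \
                 | awk NR==3 | tr -d \")
            # Stream that video to mplayer exactly as in the ytsearch variant.
            youtube-dl -q 'https://www.youtube.com/watch?v='"$id" -o - \
                | mplayer -vo null /dev/fd/3 3<&0 </dev/tty
        }
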
  • In place of "output-filename.mp4" put the name you want the file to be saved under; in place of "youtube-video-link" put the link to the video page, e.g. http://www.youtube.com/watch?v=AclA-7YntvE; and in place of "format-number" put the number of the format you would like. To get the format number, first run youtube-dl -F "youtube-video-link" and it will list all the available formats with their numbers; for example, to download the 360p MP4 use number "18". To let it fetch the best quality available automatically, just remove -f "format-number" and you are good to go. A worked example follows after this entry.


    2
    wget -O "output-filename.mp4" $( youtube-dl -g -f "format-number" "youtube-video-link" )
    unixmonkey57804 · 2013-05-19 16:25:30 2
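
    A worked example of that two-step workflow (the URL is the one quoted in the description; "my-video.mp4" is just a placeholder name):

        # 1. List the available formats and note the number you want:
        youtube-dl -F "http://www.youtube.com/watch?v=AclA-7YntvE"
        # 2. Ask youtube-dl for the direct media URL of that format (-g) and
        #    hand it to wget, which does the actual download:
        wget -O "my-video.mp4" "$(youtube-dl -g -f 18 'http://www.youtube.com/watch?v=AclA-7YntvE')"
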
  • Usage: ytmp3 "YTurl" "YTurl2" "YTurl3" "YTurlN". Uses the shift builtin to let you extract the .mp3 from as many YouTube URLs as you like (or from anywhere else youtube-dl is supported). Requires youtube-dl. The original chunk of code, youtube-dl -q -t --extract-audio --audio-format mp3 URL, was taken from http://www.commandlinefu.com/commands/view/9701/convert-youtube-videos-to-mp3. A commented layout follows after this entry.


    2
    function ytmp3() { while (($#)); do (cd ~/Music; echo "Extracting mp3 from $(youtube-dl -e $1)"; /usr/bin/youtube-dl -q -t --extract-audio --audio-format mp3 $1); shift; done ; }
    snipertyler · 2013-08-08 06:44:29 1
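
    The same function laid out with comments, assuming bash (quoting added around $1, otherwise unchanged; -t is the older youtube-dl "name after title" option):

        function ytmp3() {
            while (($#)); do            # loop over every URL on the command line
                (
                    cd ~/Music          # subshell: work in ~/Music without moving the caller
                    echo "Extracting mp3 from $(youtube-dl -e "$1")"    # -e prints the title
                    /usr/bin/youtube-dl -q -t --extract-audio --audio-format mp3 "$1"
                )
                shift                   # move on to the next URL
            done
        }
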
  • Download video files from a bunch of sites (here is the list: https://rg3.github.io/youtube-dl/supportedsites.html). The options say: base the filename on the title, ignore errors, and continue partial downloads. It also stores some metadata in a .json file. Paste YouTube users and playlists for extra fun. Protip: git-annex loves these files. The long-option spelling is shown after this entry.


    1
    youtube-dl -tci --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
    wires · 2014-10-13 21:18:34 1
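
    The same invocation with long options, for reference (equivalent in the youtube-dl versions of that era; -t/--title was later deprecated in favour of an explicit -o "%(title)s.%(ext)s" template):

        youtube-dl --title --continue --ignore-errors --write-info-json \
            "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
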
  • Explanation: first the function checks whether the user gave it any input, and notifies them if they failed to do so. If a search string was entered, the function calls youtube-dl to find the URL of the audio of the first matching YouTube video and plays that with mpv. Call the function by wrapping the search string in quotes: listen-to-yt "sultans of swing". You have to paste the line into your .zshrc and source .zshrc for it to work. Limitations: the dependencies are youtube-dl and mpv. This one-liner is borrowed from http://www.bashoneliners.com/oneliners/302/. An unrolled version follows after this entry.


    1
    listen-to-yt() { if [[ -z "$1" ]]; then echo "Enter a search string!"; else mpv "$(youtube-dl --default-search 'ytsearch1:' \"$1\" --get-url | tail -1)"; fi }
    emphazer · 2019-12-18 14:22:12 9
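
    The same function unrolled with comments (the escaped quotes around $1 are simplified to plain quoting inside the command substitution; behaviour is otherwise the same):

        listen-to-yt() {
            if [[ -z "$1" ]]; then
                echo "Enter a search string!"
            else
                # ytsearch1: limits youtube-dl to the first search hit; --get-url
                # prints the resolved stream URL(s), and tail -1 keeps the last
                # one (the audio stream when video and audio are separate),
                # which mpv then plays.
                mpv "$(youtube-dl --default-search 'ytsearch1:' "$1" --get-url | tail -1)"
            fi
        }
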
  • yt2mp3(){ for j in `seq 1 301`;do i=`curl -s gdata.youtube.com/feeds/api/users/$1/uploads\?start-index=$j\&max-results=1|grep -o "watch[^&]*"`;ffmpeg -i `wget youtube.com/$i -qO-|grep -o 'url_map"[^,]*'|sed -n '1{s_.*|__;s_\\\__g;p}'` -vn -ab 128k "`youtube-dl -e ${i#*=}`.mp3";done;} Squeezed the monster (and nifty ☺) command from #7776 down from 531 characters to 284, but I don't see a way to get it under 255. This is definitely a kludge! A readable expansion follows after this entry.


    0
    Command in description (Your command is too long - please keep it to less than 255 characters)
    __ · 2011-02-03 08:25:42 2
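
    For readability, here is the same function unrolled with comments, assuming bash (logic unchanged apart from quoting the URLs and command substitutions instead of relying on backslash escapes; note that the gdata v2 feed and the url_map scraping it relies on have long since been retired, so this is of historical interest only):

        yt2mp3() {
            # Walk the first 301 uploads of the YouTube user given as $1.
            for j in $(seq 1 301); do
                # Ask the gdata v2 feed for upload number $j and keep its
                # "watch?v=..." path.
                i=$(curl -s "gdata.youtube.com/feeds/api/users/$1/uploads?start-index=$j&max-results=1" \
                    | grep -o "watch[^&]*")
                # Scrape the flash url_map from the watch page, feed the first
                # stream URL it lists to ffmpeg, and encode audio only (-vn) at
                # 128k into an MP3 named after the title (youtube-dl -e).
                ffmpeg -i "$(wget "youtube.com/$i" -qO- \
                             | grep -o 'url_map"[^,]*' \
                             | sed -n '1{s_.*|__;s_\\\__g;p}')" \
                    -vn -ab 128k "$(youtube-dl -e "${i#*=}").mp3"
            done
        }
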
  • Before you use this command, replace everything after "https:" with the URL of the video page you want to download. This string and its switches will use youtube-dl to download the YouTube URL into the directory/folder from which it is called. It will output the video under the same name YouTube uses. The switches are spelled out after this entry.


    0
    youtube-dl -c -o "%(title)s" -f 18 https://www.youtube.com/watch?v=5qSCKUCjdKg
    tg0000 · 2014-06-12 23:31:55 2
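
    The same invocation with long options and per-switch comments (format 18 is the 360p MP4 mentioned in the entries above):

        # --continue resumes a partially downloaded file, -o "%(title)s" names
        # the output after the video title, and -f 18 picks the 360p MP4 format.
        youtube-dl --continue -o "%(title)s" --format 18 "https://www.youtube.com/watch?v=5qSCKUCjdKg"
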

  • 0
    youtube-dl --extract-audio --audio-format mp3 <video URL>
    ale3andro · 2014-12-03 07:57:38 0
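
    A note on this one: audio extraction needs ffmpeg (or avconv) installed for the MP3 conversion, and an output template keeps the video title as the file name. A sketch, reusing a URL from an entry above as a stand-in:

        youtube-dl --extract-audio --audio-format mp3 -o "%(title)s.%(ext)s" \
            "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
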
  • Then run with: play "franz ferdinand the fallen". If you're running mpv, use this function instead: play() { mpv --cache=4096 --cache-initial=256 <(youtube-dl -f 140 -o - ytsearch:"$1"); }. A commented version of the mplayer variant follows after this entry.


    0
    play() { mplayer -cache 4096 -cache-min 5 <(youtube-dl -f 140 -o - ytsearch:"$1"); }
    ryanmjacobs · 2014-12-23 03:31:57 0
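
    With comments (a sketch; format 140 is YouTube's audio-only m4a stream, and the process substitution feeds the download straight into the player):

        play() {
            # ytsearch: makes youtube-dl treat "$1" as a search term and take
            # the first hit; -o - streams it, and <(...) hands that stream to
            # mplayer, which buffers it with a 4 MB cache.
            mplayer -cache 4096 -cache-min 5 <(youtube-dl -f 140 -o - ytsearch:"$1")
        }
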
  • The command below is the meat of the script. I have a key in uzbl that puts the current URL into the clipboard (used when I am on the YouTube page), or I right-click a link to a YouTube page, or however else you want to get the URL into the clipboard. With xbindkeys I run this from the keyboard. The script:

        #!/bin/bash
        # Get URL from command line arg if given, else use clipboard.
        if [[ "$1" == "" ]] ; then
            url=$(xclip -o)
        else
            url="$1"
        fi
        # Strip it down to remove cruft
        url="${url%%&feature*}"
        url="${url%%&list*}"
        url="${url%%&index*}"
        # optional
        zenity --warning --timeout=1 --title="Running mplayer" --text="$url"
        mplayer $(youtube-dl -f best -g "$url" 2>/dev/null)


    0
    mplayer $(youtube-dl -f best -g "$url" 2>/dev/null)
    jtgd · 2015-03-16 20:54:27 0
  • Downloads a frame of the given YouTube video at 8 minutes 14 seconds. The requested format is "299", a video-only 1080p stream. A commented version follows after this entry.


    -1
    ffmpeg -ss 8:14 -i $(youtube-dl -f 299 --get-url URL) -vframes 1 -q:v 2 out.jpg
    bugmenot · 2021-07-06 10:59:49 42
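
    The same pattern with comments and line breaks (a sketch; the URL slot is filled with a watch link borrowed from an entry above purely as a stand-in):

        # -ss before -i makes ffmpeg seek to 8:14 in the input before decoding,
        # -vframes 1 grabs a single frame, and -q:v 2 keeps the JPEG quality high.
        ffmpeg -ss 8:14 \
            -i "$(youtube-dl -f 299 --get-url 'https://www.youtube.com/watch?v=dQw4w9WgXcQ')" \
            -vframes 1 -q:v 2 out.jpg
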
