Download files linked in an RSS feed

wget -q -O- http://www.yourfeed.com/rss | grep -o "<link[ -~][^>]*" | grep -o "http://www.myfeed.com[ -~][^\"]*" | sed "s: :%20:g" | xargs wget -c
Just added a little URL encoding with sed, since URLs with spaces don't work well. This version also works against <link> tags instead of <enclosure> tags, and it adds a sample to show that you can filter for links at a certain domain. A commented, step-by-step version follows the sample output below.
Sample Output
Reusing existing connection to www.yourfeed.com:80.
HTTP request sent, awaiting response... 200 OK
Length: 176975 (173K) [application/pdf]
Saving to: 'MyFile.pdf'

MyFile.pdf            100%[==================================================================================================================>] 172.83K   488KB/s   in 0.4s   

2015-10-30 15:06:24 (488 KB/s) - 'MyFile.pdf' saved [176975/176975]
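
A commented, step-by-step version of the pipeline may be easier to follow. This is a minimal sketch under the same assumptions as the one-liner above; FEED and FILTER are illustrative names, and both URLs are placeholders.

    #!/bin/bash
    # Minimal sketch of the pipeline above, one stage per line.
    # FEED and FILTER are illustrative placeholders; substitute your own values.
    FEED="http://www.yourfeed.com/rss"
    FILTER="http://www.myfeed.com"

    wget -q -O- "$FEED" |             # fetch the feed, writing it to stdout
      grep -o "<link[ -~][^>]*" |     # isolate the <link ...> tags
      grep -o "$FILTER[ -~][^\"]*" |  # keep only URLs at the chosen domain
      sed 's: :%20:g' |               # percent-encode embedded spaces
      xargs wget -c                   # download each URL, resuming partial files

Note that the sed stage only escapes spaces; any other characters that need percent-encoding are left alone.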

5 Alternatives

  • The difference between the original version and this one is that this one works, rather than producing a wget error.
    curl $1 | grep -E "http.*\.mp3" | sed "s/.*\(http.*\.mp3\).*/\1/" | xargs wget
    theodric · 2015-09-17 13:19:53
  • Neither of the others worked for me. This does.
    curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
    dakira · 2016-05-29 12:07:21
  • This command can be used to download enclosed files from an RSS feed, for example mp3 files from a podcast's RSS feed (see the sketch after this list).
    wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*" | xargs wget -c
    talha131 · 2013-09-24 12:38:08

  • wget `curl -s <podcast feed URL> | grep -o 'https*://[^"]*mp3' | head -1`
    tbon3r · 2017-07-16 23:02:03
  • Directly download all mp3 files of the desired podcast.
    curl http://radiofrance-podcast.net/podcast09/rss_14726.xml | grep -Eo "(http|https)://[a-zA-Z0-9./?=_%:-]*mp3" | sort -u | xargs wget
    pascalv · 2021-08-09 13:40:26
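
The alternatives above differ mainly in which tag they match (<link> vs. <enclosure>) and whether they accept https. The sketch below folds those ideas together under the assumption that the media URLs end in .mp3; the feed URL is a placeholder.

    #!/bin/bash
    # Sketch combining the alternatives: accept http or https, match URLs
    # carried by either <link> or <enclosure> tags, deduplicate, download.
    # The feed URL is a placeholder.
    FEED="http://example-podcast-feed.com/rss"

    curl -s "$FEED" |
      grep -Eo '<(link|enclosure)[^>]*' |            # tags that can carry a URL
      grep -Eo 'https?://[a-zA-Z0-9./?=_%:-]*mp3' |  # the media URLs themselves
      sort -u |                                      # drop duplicates
      xargs wget -c                                  # fetch, resuming partial files

As in pascalv's version, sort -u prevents downloading the same file twice when a URL appears in more than one tag.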

