
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands tagged download - 31 results
youtube-dl -tci --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
2014-10-13 21:18:34
User: wires
1

Download video files from a wide range of sites (the full list is at https://rg3.github.io/youtube-dl/supportedsites.html).

The options: base the filename on the title, ignore errors, continue partial downloads, and also store some metadata in a .json file (spelled out with long options below).

Paste YouTube users and playlists for extra fun.

Protip: git-annex loves these files
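
Spelled out with long options, the same command reads (a sketch, assuming a youtube-dl version that still ships these flags):

youtube-dl --title --continue --ignore-errors --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"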

aria2c -x 4 http://my/url
2014-07-26 03:06:33
User: lx45803
1

jrk's aria2 example is incorrect. -s specifies the global connection limit; the per-host connection limit is specified with -x.
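
To combine the per-host limit with segmented downloading, something like this should work (a sketch; the URL and counts are placeholders):

aria2c -x 4 -s 4 http://my/url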

ssh USER@HOST cat REMOTE_FILE.mp4 | tee LOCAL_FILE.mp4 | mplayer -
2013-11-28 11:25:26
User: flatcap
Functions: cat ssh tee
6

Securely stream a file from a remote server (and save it locally).

Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth.

This is an extension of snipertyler's idea.

Note: This command uses an encrypted connection, unlike the original.
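
If you only want to watch and don't need a local copy, the tee stage can simply be dropped (same placeholders as above):

ssh USER@HOST cat REMOTE_FILE.mp4 | mplayer -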

nc HOST PORT | tee movie.mp4 | mplayer -
2013-11-28 01:38:29
User: snipertyler
Functions: tee
7

Requires a listening port on HOST

eg. "cat movie.mp4 | nc -l 1356 " (cat movie.mp4 | nc -l PORT)

Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth.

You can't seek (it'll crash and kill the stream) but you can pause it.
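
Putting both ends together (HOST, port and filename are placeholders; some netcat variants want -p before the port). Start the sender on HOST first:

cat movie.mp4 | nc -l 1356

then connect from the local machine:

nc HOST 1356 | tee movie.mp4 | mplayer -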

aria2c --max-download-limit=100K file.metalink
2013-03-26 16:05:52
User: totti
0

Throttle download speed

aria2c --max-download-limit=100K file.metalink

Throttle upload speed

aria2c --max-upload-limit=100K file.torrent
axel --max-speed=x
2013-03-26 16:00:43
User: totti
Tags: download speed
2

Axel

--max-speed=x, -s x

You can specify a speed (bytes per second) here and Axel will try to keep the average speed around this speed. Useful if you don't want the program to suck up all of your bandwidth.
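
For example, to keep Axel at roughly 500 KB/s (the value is in bytes per second; the URL is a placeholder):

axel --max-speed=512000 http://example.com/big.iso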

wget -O - "[PICASA ALBUM RSS LINK]" |sed 's/</\n</g' | grep media:content |sed 's/.*url='"'"'\([^'"'"']*\)'"'"'.*$/\1/' |awk -F'/' '{gsub($NF,"d/"$NF); print $0}'|wget -i -
sudo apt-get <apt-get command and options> --print-uris -qq | sed -n "s/'\([^ ]\+\)' \([^ ]\+\) \([^ ]\+\) MD5Sum:\([^ ]\+\)/wget -c \1/p" > download_deb_list.txt
curl -C - -o partially_downloaded_file 'www.example.com/path/to/the/file'
for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
tpb() { wget -U Mozilla -qO - $(echo "http://thepiratebay.org/search/$@/0/7/0" | sed 's/ /\%20/g') | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent' | tac; }
2011-10-26 12:15:55
User: Bonster
Functions: echo grep sed wget
3

usage: tpb searchterm

example: tpb the matrix trilogy

This searches The Pirate Bay for torrents and displays the top results in reverse order, so the first result is at the bottom instead of the top, which is more convenient for command-line users.

wgetall () { wget -r -l2 -nd -Nc -A ".$1" "$2"; }
2011-09-28 09:43:25
Functions: wget
0

Recursively download all files of a certain type down to two levels, ignoring directory structure and local duplicates.

Usage:

wgetall mp3 http://example.com/download/

for i in `seq -w 1 50`; do wget --continue http://commandline.org.uk/images/posts/animal/$i.jpg; done
curl -s --compressed http://funnyjunk.com | awk -F'"' '/ '"'"'mainpagetop24h'"'"'/ { print "http://funnyjunk.com"$4 }' | xargs curl -s | grep -o 'ht.*m/pictures/.*\.jpg\|ht.*m/gifs/.*\.gif' | grep "_......_" | uniq | xargs wget
2011-07-21 15:57:21
User: laniner
Functions: awk uniq xargs
0

If your version of curl does not support the --compressed option, use

curl -s http://funnyjunk.com | gunzip

instead of

curl -s --compressed http://funnyjunk.com
wget -U Mozilla -qO - "http://thepiratebay.org/search/your_querry_here/0/7/0" | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent'
2011-04-15 15:01:16
User: sairon
Functions: grep wget
3

This one-liner greps the first 30 direct URLs for .torrent files matching your search query, ordered by number of seeds (descending; determined by the second number after your query, in this case 7; for other options just check the site via your favourite web browser).

You don't have to care about grepping the torrent names as well, because they are already included in the .torrent URL (except for spaces and some other characters replaced by underscores, but still human-readable).

Be sure to have some http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ).

I've also coded a more user-friendly ash (should be BASH compatible) script, which also lists the total size of download and number of seeds/peers (available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html - may need some tweaking, as it was written for a router running OpenWrt and transmission).

Happy downloading!

yt-pl2mp3() { umph -m 50 $1 | cclive -f mp4_720p; IFS=$(echo -en "\n\b"); for track in $(ls | grep mp4 | awk '{print $0}' | sed -e 's/\.mp4//'); do (ffmpeg -i $track.mp4 -vn -ar 44100 -ac 2 -ab 320 -f mp3 $track.mp3); done; rm -f *.mp4; }
2011-03-20 14:43:20
User: sattellite
Functions: awk echo grep ls rm sed
0

umph parses video links from YouTube playlists ( http://code.google.com/p/umph/ )

cclive downloads videos from YouTube ( http://cclive.sourceforge.net/ )

Example:

yt-pl2mp3 7AB74822FE7D03E8
for file in `cat urls.txt`; do echo -n "$file " >> log.txt; curl --head $file >> log.txt ; done
2010-10-19 02:54:13
User: Glutnix
Functions: echo file
-1

urls.txt should have a fully qualified URL on each line

prefix with

rm log.txt;

to clear the log

change curl command to

curl --head $file | head -1 >> log.txt

to just get the HTTP status line
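
Putting those tweaks together, a status-only sketch (same urls.txt and log.txt, with -s added to silence curl's progress output):

rm -f log.txt; for file in $(cat urls.txt); do echo -n "$file " >> log.txt; curl -s --head $file | head -1 >> log.txt; done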

scp $user@$server:$path/to/file .
file=ftp://ftp.gimp.org/pub/gimp/v2.6/gimp-2.6.10.tar.bz2; ssh server "wget $file -O -" > $PWD/${file##*/}
2010-08-02 15:59:45
User: michaelmior
Functions: file ssh
Tags: ssh bash download
2

This command will download $file via server. I've used this when FTP was broken at the office and I needed to download some software packages.
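
If wget isn't available on the intermediate server, curl should do the same job, since it writes to stdout by default (an untested sketch, same placeholders):

file=ftp://ftp.gimp.org/pub/gimp/v2.6/gimp-2.6.10.tar.bz2; ssh server "curl -s $file" > $PWD/${file##*/}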

d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
2010-05-18 07:38:54
User: lrvick
Functions: c++ egrep sed
9

Requires aria2c, but wget or anything else would work just as well.

A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-)
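
As noted, wget works just as well; the same loop with the download step swapped (an untested sketch):

d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do wget "${u}";done;done;done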

zsync -i existing-file-on-disk.iso http://example.com/new-release.iso.zsync
2010-04-20 07:02:37
User: rkulla
2

Zsync is an implementation of rsync over HTTP that allows updating of files from a remote Web server without requiring a full download. For example, if you already have a Debian alpha, beta or RC copy downloaded, zsync can just download the updated bits of the new release of the file from the server.

This requires the distributor of the file to have created a zsync build control file (using zsyncmake).
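
On the publishing side, the control file is generated with zsyncmake, roughly like this (the URL and filename are placeholders):

zsyncmake -u http://example.com/new-release.iso new-release.iso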

trickle -d 60 wget http://very.big/file
2010-03-29 06:55:30
Functions: wget
7

Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use.

The most simple application is to limit the bandwidth usage of programs.
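
trickle can shape uploads too; for example, to cap an scp upload at around 20 KB/s (the rate, file and host are placeholders):

trickle -u 20 scp bigfile user@remote:/tmp/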

wget 'link of a Picasa WebAlbum' -O - |perl -e'while(<>){while(s/"media":{"content":\[{"url":"(.+?\.JPG)//){print "$1\n"}}' |wget -w1 -i -
curl -s -c /tmp/cookie -k -u tivo:$MAK --digest "$(curl -s -c /tmp/cookie -k -u tivo:$MAK --digest https://$tivo/nowplaying/index.html | sed 's;.*<a href="\([^"]*\)">Download MPEG-PS</a>.*;\1;' | sed 's|\&amp;|\&|')" | tivodecode -m $MAK -- - > tivo.mpg
2009-09-26 03:00:46
User: matthewbauer
Functions: sed
0

Download the last show on your TiVo DVR.

Replace $MAK with your MAK (see https://www3.tivo.com/tivo-mma/showmakey.do).

Replace $tivo with your TiVo's IP address.
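
For example, set the two variables before running it (the values below are placeholders):

MAK=0123456789; tivo=192.168.1.42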