What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Terminal - Commands tagged download - 33 results
get_iplayer --type=radio --channel "Radio 4 Extra" | grep : | awk '{ if ( NR > 1 ) { print } }'|sed 's/:.*//' |sed '$ d' > pidlist && while read p; do get_iplayer --get --fields=pid $p; done <pidlist && rm pidlist
2016-01-16 17:20:54
User: dunryc
Functions: awk grep read rm sed

Use get_iplayer to download all listed content from http://www.bbc.co.uk/radio4extra. Run it every night to make sure no episodes are missed.
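The "run every night" suggestion fits naturally in cron. A hedged sketch, assuming the pipeline above has been saved as a script; the script path, schedule and log path here are made up, not from the original post:

```shell
# hypothetical crontab entry: run the saved get_iplayer pipeline at
# 03:00 every night, appending output to a log (paths are assumptions)
0 3 * * * /home/user/bin/r4x-grab.sh >> /home/user/r4x-grab.log 2>&1
```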

Using 3 threads to download a big file.
youtube-dl -tci --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
2014-10-13 21:18:34
User: wires

Download video files from a bunch of sites (here is a list https://rg3.github.io/youtube-dl/supportedsites.html).

The options say: base the filename on the title (-t), ignore errors (-i), and continue partial downloads (-c). --write-info-json also stores some metadata in a .json file.

Paste in YouTube users and playlists for extra fun.

Protip: git-annex loves these files

aria2c -x 4 http://my/url
2014-07-26 03:06:33
User: lx45803

jrk's aria2 example is incorrect. -s specifies the global connection limit; the per-host connection limit is specified with -x.
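The two flags can also be combined; a hedged example (the URL is the same placeholder as above) that allows up to 4 connections to the server and splits the download into 4 parts:

```shell
# -x: max connections per server, -s: number of connections used
# to split the download
aria2c -x 4 -s 4 http://my/url
```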

ssh [email protected] cat REMOTE_FILE.mp4 | tee LOCAL_FILE.mp4 | mplayer -
2013-11-28 11:25:26
User: flatcap
Functions: cat ssh tee

Securely stream a file from a remote server (and save it locally).

Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth.

This is an extension of snipertyler's idea.

Note: This command uses an encrypted connection, unlike the original.

nc HOST PORT | tee movie.mp4 | mplayer -
2013-11-28 01:38:29
User: snipertyler
Functions: tee

Requires a listening port on HOST

e.g. cat movie.mp4 | nc -l PORT (here: cat movie.mp4 | nc -l 1356)

Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth.

You can't seek (it'll crash and kill the stream) but you can pause it.
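Both this and the ssh variant above rest on tee, which writes the stream to disk while passing it along unchanged. The mechanism can be seen without any network at all; in this sketch wc -c stands in for mplayer and printf stands in for the remote sender:

```shell
# tee saves the stream to movie.bin and simultaneously forwards it
# to the consumer; wc -c just counts the bytes it receives
printf 'fake video data' | tee movie.bin | wc -c
```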

aria2c --max-download-limit=100K file.metalink
2013-03-26 16:05:52
User: totti

Throttle download speed

aria2c --max-download-limit=100K file.metalink

Throttle upload speed

aria2c --max-upload-limit=100K file.torrent
axel --max-speed=x
2013-03-26 16:00:43
User: totti
Tags: download speed


--max-speed=x, -s x

You can specify a speed (bytes per second) here and Axel will try to keep the average speed around this speed. Useful if you don't want the program to suck up all of your bandwidth.

wget -O - "[PICASA ALBUM RSS LINK]" |sed 's/</\n</g' | grep media:content |sed 's/.*url='"'"'\([^'"'"']*\)'"'"'.*$/\1/' |awk -F'/' '{gsub($NF,"d/"$NF); print $0}'|wget -i -
sudo apt-get <apt-get command and options> --print-uris -qq | sed -n "s/'\([^ ]\+\)' \([^ ]\+\) \([^ ]\+\) MD5Sum:\([^ ]\+\)/wget -c \1/p" > download_deb_list.txt
curl -C - -o partially_downloaded_file 'www.example.com/path/to/the/file'
for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
tpb() { wget -U Mozilla -qO - $(echo "http://thepiratebay.org/search/$@/0/7/0" | sed 's/ /\%20/g') | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent' | tac; }
2011-10-26 12:15:55
User: Bonster
Functions: echo grep sed wget

usage: tpb searchterm

example: tpb the matrix trilogy

This searches for torrents from thepiratebay and displays the top results in reverse order, so the 1st result is at the bottom instead of the top -- which is better for command line users.

wgetall () { wget -r -l2 -nd -Nc -A."$1" "$2"; }
2011-09-28 09:43:25
Functions: wget

Recursively download all files of a certain type down to two levels, ignoring directory structure and local duplicates.


wgetall mp3 http://example.com/download/

for i in `seq -w 1 50`; do wget --continue http://commandline.org.uk/images/posts/animal/$i.jpg; done
curl -s --compressed http://funnyjunk.com | awk -F'"' '/ '"'"'mainpagetop24h'"'"'/ { print "http://funnyjunk.com"$4 }' | xargs curl -s | grep -o 'ht.*m/pictures/.*\.jpg\|ht.*m/gifs/.*\.gif' | grep "_......_" | uniq | xargs wget
2011-07-21 15:57:21
User: laniner
Functions: awk uniq xargs

If your version of curl does not support the --compressed option, use

curl -s http://funnyjunk.com | gunzip

instead of

curl -s --compressed http://funnyjunk.com
wget -U Mozilla -qO - "http://thepiratebay.org/search/your_querry_here/0/7/0" | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent'
2011-04-15 15:01:16
User: sairon
Functions: grep wget

This one-liner greps the first 30 direct URLs for .torrent files matching your search query, ordered by number of seeds (descending; determined by the second number after your query, in this case 7; for other options just check the site via your favorite web browser).

You don't have to care about grepping the torrent names as well, because they are already included in the .torrent URL (except for spaces and some other characters replaced by underscores, but still human-readable).

Be sure to have some http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ).

I've also coded a more user-friendly ash (should be BASH compatible) script, which also lists the total size of download and number of seeds/peers (available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html - may need some tweaking, as it was written for a router running OpenWrt and transmission).

Happy downloading!

yt-pl2mp3() { umph -m 50 $1 | cclive -f mp4_720p; IFS=$(echo -en "\n\b"); for track in $(ls | grep mp4 | sed -e 's/\.mp4//'); do ffmpeg -i "$track.mp4" -vn -ar 44100 -ac 2 -ab 320k -f mp3 "$track.mp3"; done; rm -f *.mp4; }
2011-03-20 14:43:20
User: sattellite
Functions: awk echo grep ls rm sed

umph is parsing video links from Youtube playlists ( http://code.google.com/p/umph/ )

cclive is downloading videos from Youtube ( http://cclive.sourceforge.net/ )


yt-pl2mp3 7AB74822FE7D03E8
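As a side note, the ls | grep | sed chain in the loop only strips the .mp4 suffix; plain globbing and parameter expansion do the same job and cope better with odd filenames. A minimal sketch of that piece in isolation:

```shell
# iterate over .mp4 files and strip the suffix with ${var%pattern}
for track in *.mp4; do
  [ -e "$track" ] || continue   # skip the literal pattern when nothing matches
  echo "${track%.mp4}"
done
```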
for file in `cat urls.txt`; do echo -n "$file " >> log.txt; curl --head $file >> log.txt ; done
2010-10-19 02:54:13
User: Glutnix
Functions: echo file

urls.txt should have a fully qualified URL on each line.

Prefix with rm log.txt; to clear the log.

To get just the HTTP status, change the curl command to

curl --head $file | head -1 >> log.txt
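The head -1 variant works because the status line is always the first line of the response headers. Demonstrated here on a canned response (the header text is made up, standing in for the output of curl --head):

```shell
# the first header line is the status line; tr strips the trailing CR
# that HTTP headers carry
response=$'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n'
printf '%s' "$response" | head -1 | tr -d '\r'
```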

scp $user@$server:$path/to/file .
file=ftp://ftp.gimp.org/pub/gimp/v2.6/gimp-2.6.10.tar.bz2; ssh server "wget $file -O -" > $PWD/${file##*/}
2010-08-02 15:59:45
User: michaelmior
Functions: file ssh
Tags: ssh bash download

This command will download $file via server. I've used this when FTP was broken at the office and I needed to download some software packages.

d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
2010-05-18 07:38:54
User: lrvick
Functions: egrep sed

Requires aria2c but could just as easily wget or anything else.

A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-)

zsync -i existing-file-on-disk.iso http://example.com/new-release.iso.zsync
2010-04-20 07:02:37
User: rkulla

Zsync is an implementation of rsync over HTTP that allows updating of files from a remote Web server without requiring a full download. For example, if you already have a Debian alpha, beta or RC copy downloaded, zsync can just download the updated bits of the new release of the file from the server.

This requires the distributor of the file to have created a zsync build control file (using zsyncmake).
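On the distribution side, that control file is produced with zsyncmake; a hedged sketch, with the filename chosen to match the example above:

```shell
# run next to the file being published; writes new-release.iso.zsync,
# which clients then pass to zsync as shown in the command above
zsyncmake new-release.iso
```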

trickle -d 60 wget http://very.big/file
2010-03-29 06:55:30
Functions: wget

Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use.

The simplest application is to limit the bandwidth usage of programs.
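Uploads can be capped the same way with -u; a hedged example where the file name and host are placeholders, not from the original post:

```shell
# cap scp's upload bandwidth at 20 KB/s (file and host are made up)
trickle -u 20 scp bigfile user@example.com:
```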