Terminal - Commands tagged download - 31 results
echo 'Enter Picasa album RSS URL:'; read -e feedurl; GET "$feedurl" | sed 's/</\n</g' | grep media:content | sed 's/.*url='"'"'\([^'"'"']*\)'"'"'.*$/\1/' > wgetlist
2009-09-22 10:51:08
User: kamathln
Functions: echo
0

Grab the RSS link to the Picasa album. Feed it to the script when it's hungry. When it's done writing the shopping list, just use

wget -c -i wgetlist

to get your stuff.
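The same pipeline works non-interactively if the feed URL is supplied up front (a sketch; the album feed URL is a hypothetical placeholder, and GET comes from the libwww-perl package):

feedurl='http://picasaweb.google.com/data/feed/base/user/someuser/albumid/12345'; GET "$feedurl" | sed 's/</\n</g' | grep media:content | sed 's/.*url='"'"'\([^'"'"']*\)'"'"'.*$/\1/' > wgetlist && wget -c -i wgetlist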

echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w '%{speed_download}' -o /dev/null -s $URL) >> bps; sleep 10; done &
2009-09-19 21:26:06
User: matthewbauer
Functions: date echo sleep
9

This will log your internet download speed.

You can run

gnuplot -persist <(echo "plot 'bps' with lines")

to get a graph of it.
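A variant that timestamps each sample makes the graph's x-axis meaningful (a sketch building on the same loop, not part of the original submission):

URL=http://www.google.com; while true; do echo "$(date +%s) $(curl -L -w '%{speed_download}' -o /dev/null -s $URL)" >> bps; sleep 10; done &

gnuplot -persist <(echo "plot 'bps' using 1:2 with lines")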

while [ -n "$(pgrep wget)" ]; do sleep 2; done; [ -e /tmp/nosleep ] || echo mem > /sys/power/state
2009-09-06 05:51:20
User: kamathln
Functions: echo sleep
1

[Note: This command needs to be run as root].

If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any time you feel the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
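Creating the guard file is as simple as:

touch /tmp/nosleep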

y=http://www.youtube.com;for i in $(curl -s $f|grep -o "url='$y/watch?v=[^']*'");do d=$(echo $i|sed "s|url\='$y/watch?v=\(.*\)&.*'|\1|");wget -O $d.flv "$y/get_video.php?video_id=$d&t=$(curl -s "$y/watch?v=$d"|sed -n 's/.* "t": "\([^"]*\)",.*/\1/p')";done
2009-08-22 21:31:29
User: matthewbauer
Functions: echo grep sed
3

This will download a YouTube playlist, and most of the other feed types listed at http://code.google.com/apis/youtube/2.0/reference.html#Video_Feeds. Set $f to the feed URL before running it.

The files will be saved as $id.flv
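For example, for a playlist you might set (a sketch; PLAYLIST_ID is a placeholder, and the feed URL format is assumed from the GData 2.0 API referenced above):

f=http://gdata.youtube.com/feeds/api/playlists/PLAYLIST_ID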

aria2c -s 4 http://my/url
2009-08-11 22:34:00
User: jrk
8

`aria2c` (from the aria2 project) allows segmented downloading. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs for the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.
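A sketch of the multi-source form, with two hypothetical mirrors serving the same file:

aria2c -s 2 http://mirror1.example.com/file.iso http://mirror2.example.com/file.iso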

cat file-that-failed-to-download.zip > successfully-downloaded.zip; curl -C - -o successfully-downloaded.zip http://www.somewhere.com/file-I-want-to-download.zip
2009-08-05 13:33:06
Functions: cat
-1

If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of having to start downloading from the beginning. This is a real win for downloading Debian ISO images over a buggy DSL modem.

cat the partially downloaded file into the target file, then use the "-C -" option followed by the URL of the file you were originally downloading; curl checks how much of the target file is already there and resumes from that offset.
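wget offers the same resume behaviour via its -c flag (as used in the Picasa entry above):

wget -c http://www.somewhere.com/file-I-want-to-download.zip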