This will log your internet download speed.
You can run
gnuplot -persist <(echo "plot 'bps' with lines")
to get a graph of it.
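The logging command itself isn't shown here, but a minimal sketch that would produce such a 'bps' file (assuming Linux's /proc/net/dev and an eth0 interface, both placeholders) might be:
prev=$(awk -F'[: ]+' '/eth0/ {print $3}' /proc/net/dev)  # received bytes so far
while sleep 1; do cur=$(awk -F'[: ]+' '/eth0/ {print $3}' /proc/net/dev); echo $((cur - prev)) >> bps; prev=$cur; done  # append bytes/sec each second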
Requires aria2c, but you could just as easily use wget or anything else. A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-)
`aria2c` (from the aria2 project) allows segmented downloading. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs to the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.
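A sketch of both uses (the mirror URLs are placeholders):
aria2c -s 4 http://mirror1.example.com/file.iso http://mirror2.example.com/file.iso
This splits the download into 4 segments and lets aria2c pull them from both mirrors at once.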
Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use. The simplest application is to limit the bandwidth usage of programs.
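For example, to cap a wget download at roughly 100 KB/s (the URL is a placeholder):
trickle -d 100 wget http://example.com/big.iso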
Requires a listening port on HOST, e.g. "cat movie.mp4 | nc -l PORT". Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth. You can't seek (it'll crash and kill the stream), but you can pause it.
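A sketch of the receiving side (HOST, PORT and the choice of mplayer are placeholders; any player that reads stdin works):
nc HOST PORT | tee movie.mp4 | mplayer -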
Securely stream a file from a remote server (and save it locally). Useful if you're impatient and want to watch a movie immediately and download it at the same time without using extra bandwidth. This is an extension of snipertyler's idea. Note: This command uses an encrypted connection, unlike the original.
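A sketch of such an encrypted variant (host, path and player are assumptions):
ssh user@host 'cat /path/to/movie.mp4' | tee movie.mp4 | mplayer -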
jrk's aria2 example is incorrect. -s specifies the global connection limit; the per-host connection limit is specified with -x.
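Putting that correction to use (the URL is a placeholder), this splits into 16 segments while opening at most 4 connections to any single server:
aria2c -s 16 -x 4 http://example.com/file.iso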
This one-liner greps the first 30 direct URLs for .torrent files matching your search query, ordered by number of seeds (descending; determined by the second number after your query, in this case 7; for other options just check the site via your favorite web browser). You don't have to bother grepping the torrent names as well, because they are already included in the .torrent URL (except for spaces and some other characters, which are replaced by underscores, but still human-readable). Be sure to have some http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ). I've also written a more user-friendly ash (should be BASH-compatible) script, which also lists the total download size and the number of seeds/peers (available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html - may need some tweaking, as it was written for a router running OpenWrt and transmission). Happy downloading!
usage: tpb searchterm
example: tpb the matrix trilogy
This searches for torrents from thepiratebay and displays the top results in reverse order, so the first result is at the bottom instead of the top -- which is better for command-line users.
This will download a YouTube playlist and most anything else covered by the YouTube video feeds: http://code.google.com/apis/youtube/2.0/reference.html#Video_Feeds The files will be saved as $id.flv
Zsync is an implementation of rsync over HTTP that allows updating of files from a remote Web server without requiring a full download. For example, if you already have a Debian alpha, beta or RC copy downloaded, zsync can just download the updated bits of the new release of the file from the server. This requires the distributor of the file to have created a zsync build control file (using zsyncmake).
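A sketch of both sides (filenames and URL are placeholders):
zsyncmake debian-rc1.iso  # distributor: creates debian-rc1.iso.zsync
zsync -i debian-beta.iso http://cdimage.example.org/debian-rc1.iso.zsync  # client: reuses matching blocks from the old image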
This command will download $file via the server. I've used this when FTP was broken at the office and I needed to download some software packages.
Axel: --max-speed=x, -s x
You can specify a speed (in bytes per second) here and Axel will try to keep the average speed around this value. Useful if you don't want the program to suck up all of your bandwidth.
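For example, to hold the average around 500 KB/s (the URL is a placeholder):
axel -s 512000 http://example.com/file.iso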
[Note: This command needs to be run as root.] If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any time you feel the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
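The command itself isn't shown above; a hypothetical reconstruction of that root-run watcher (assuming pm-suspend is available) could be:
while pgrep -x wget >/dev/null; do sleep 60; done; [ -f /tmp/nosleep ] || pm-suspend  # wait for wget to exit, then suspend unless /tmp/nosleep exists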
Throttle download speed
aria2c --max-download-limit=100K file.metalink
Throttle upload speed
aria2c --max-upload-limit=100K file.torrent
Download video files from a bunch of sites (here is the list: https://rg3.github.io/youtube-dl/supportedsites.html). The options say: base the filename on the title, ignore errors, continue partial downloads, and store some metadata in a .json file. Paste in YouTube users and playlists for extra fun. Protip: git-annex loves these files.
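Spelled out with long options (the URL is a placeholder for any supported video or playlist page):
youtube-dl --title --ignore-errors --continue --write-info-json http://example.com/some-video-page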
Grab the RSS link to the Picasa album. Feed it to the script when it's hungry. When it's done writing the shopping list, just use
wget -c -i wgetlist
to get your stuff.
Download the last show on your TiVo DVR. Replace $MAK with your MAK (see https://www3.tivo.com/tivo-mma/showmakey.do) and replace $tivo with your TiVo's IP address.
umph parses video links from Youtube playlists ( http://code.google.com/p/umph/ )
cclive downloads videos from Youtube ( http://cclive.sourceforge.net/ )
Example:
yt-pl2mp3 7AB74822FE7D03E8
If your version of curl does not support the --compressed option, use
curl -s -H 'Accept-Encoding: gzip' http://funnyjunk.com | gunzip
instead of
curl -s --compressed http://funnyjunk.com
Recursively download all files of a certain type down to two levels, ignoring directory structure and local duplicates. Usage: wgetall mp3 http://example.com/download/
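A hypothetical reconstruction of a wgetall function matching that description:
wgetall() { wget -r -l 2 -nd -nc -A "$1" "$2"; }  # -r -l 2: recurse two levels; -nd: flatten directories; -nc: skip files you already have; -A: accept only the given type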