commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and of 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Throttle download speed
aria2c --max-download-limit=100K file.metalink
Throttle upload speed
aria2c --max-upload-limit=100K file.torrent
--max-speed=x, -s x
You can specify a speed (bytes per second) here and Axel will
try to keep the average speed around this speed. Useful if you
don't want the program to suck up all of your bandwidth.
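The man-page excerpt above can be wrapped up as a small helper (a sketch; `throttled_fetch` is a hypothetical name, and note that axel's -s takes bytes per second, so ~100 KB/s is 102400):

```shell
# Hypothetical helper: download a URL with axel while capping the average
# speed. axel's -s flag takes bytes per second (~100 KB/s = 102400).
throttled_fetch() {
  rate_bytes=$1
  url=$2
  axel -s "$rate_bytes" "$url"
}

# e.g.: throttled_fetch 102400 http://example.com/file.iso
```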
usage: tpb searchterm
example: tpb the matrix trilogy
This searches thepiratebay for torrents and displays the top results in reverse order, so the first result is at the bottom instead of the top, which is more convenient for command-line users.
If your version of curl does not support the --compressed option, use
curl -s http://funnyjunk.com | gunzip
instead of
curl -s --compressed http://funnyjunk.com
This one-liner greps the first 30 direct URLs for .torrent files matching your search query, ordered by number of seeds (descending; determined by the second number after your query, in this case 7; for other options just check the site in your favorite web browser).
You don't have to worry about grepping the torrent names as well, because they are already included in the .torrent URL (except for spaces and some other characters, which are replaced by underscores, but still human-readable).
Be sure to have some http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ).
I've also written a more user-friendly ash script (should be Bash-compatible), which also lists the total download size and the number of seeds/peers (available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html - it may need some tweaking, as it was written for a router running OpenWrt and transmission).
urls.txt should have a fully qualified url on each line
to clear the log
To get just the HTTP status, change the curl command to
curl --head $file | head -1 >> log.txt
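The description suggests a loop along these lines (a sketch under assumptions: `log_statuses` is a hypothetical name, and urls.txt holds one fully qualified URL per line):

```shell
# Hypothetical sketch: append the HTTP status line of every URL listed
# in a file (one URL per line) to log.txt.
log_statuses() {
  while read -r url; do
    curl -s --head "$url" | head -1 >> log.txt
  done < "$1"
}

# e.g.: log_statuses urls.txt
```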
This command will download $file via server. I've used this when FTP was broken at the office and I needed to download some software packages.
Requires aria2c, but you could just as easily use wget or anything else.
A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-)
Zsync is an implementation of rsync over HTTP that allows updating of files from a remote Web server without requiring a full download. For example, if you already have a Debian alpha, beta or RC copy downloaded, zsync can just download the updated bits of the new release of the file from the server.
This requires the distributor of the file to have created a zsync build control file (using zsyncmake).
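A typical round-trip looks something like this (hostnames and filenames are placeholders, not from the original entry):

```shell
# Publisher side: build the zsync control file next to the payload.
#   zsyncmake debian-testing.iso      # writes debian-testing.iso.zsync
#
# Client side: with an older copy of the ISO in the current directory,
# fetch only the blocks that changed.
#   zsync http://cdimage.example.org/debian-testing.iso.zsync
```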
Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use.
The most simple application is to limit the bandwidth usage of programs.
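For example, a downloader can be run under trickle like this (a sketch; `limited_run` is a hypothetical helper name, and trickle's -d takes a download cap in KB/s):

```shell
# Hypothetical helper: run any command under trickle with a download
# cap in KB/s (trickle's -d limits download, -u limits upload).
limited_run() {
  down_kbps=$1
  shift
  trickle -d "$down_kbps" "$@"
}

# e.g.: limited_run 100 wget http://example.com/file.iso
```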
Download the last show on your TiVo DVR.
Replace $MAK with your MAK (see https://www3.tivo.com/tivo-mma/showmakey.do).
Replace $tivo with your TiVo's IP address.
Grab the RSS link to the Picasa album. Feed it to the script when it's hungry. When it's done writing the shopping list, just use
wget -c -i wgetlist
to get your stuff.
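The script itself isn't shown above; it could look something like this sketch (the .jpg pattern and the `build_wgetlist` name are assumptions - Picasa feeds may need a different regex):

```shell
# Hypothetical sketch: pull direct image URLs out of an album's RSS feed
# and write them to wgetlist, ready for `wget -c -i wgetlist`.
build_wgetlist() {
  curl -s "$1" | grep -Eo 'https?://[^"<[:space:]]+\.(jpg|JPG)' > wgetlist
}
```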
This will log your internet download speed.
You can run
gnuplot -persist <(echo "plot 'bps' with lines")
to get a graph of it.
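The logging command itself isn't shown here; one way to produce such a 'bps' file on Linux is a sampler along these lines (the counter path and the `sample_bps` name are assumptions):

```shell
# Hypothetical sketch: append one bytes-per-second sample to a file
# named 'bps'. Pass the interface's receive counter, e.g.
# /sys/class/net/eth0/statistics/rx_bytes (Linux-specific path).
sample_bps() {
  prev=$(cat "$1")
  sleep 1
  cur=$(cat "$1")
  echo $((cur - prev)) >> bps
}

# e.g.: while :; do sample_bps /sys/class/net/eth0/statistics/rx_bytes; done
```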
[Note: This command needs to be run as root].
If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any time you decide the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
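The command being described isn't shown above; the idea can be sketched like this (the suspend mechanism varies by system, and `wait_then_sleep` is a hypothetical name):

```shell
# Hypothetical sketch: poll until no wget process remains, then suspend
# the machine - unless /tmp/nosleep exists. Must run as root.
wait_then_sleep() {
  while pgrep -x wget > /dev/null; do
    sleep 60
  done
  [ -e /tmp/nosleep ] || systemctl suspend
}
```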
This will download a YouTube playlist and, for the most part, anything else listed in the Video Feeds reference: http://code.google.com/apis/youtube/2.0/reference.html#Video_Feeds
The files will be saved as $id.flv.
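The original command isn't shown; the same effect can be approximated with youtube-dl's output template (a substitute, not necessarily the author's pipeline; the `fetch_playlist` name is hypothetical):

```shell
# Hypothetical sketch: download every entry of a playlist, naming each
# file after its video id - $id.flv in the text above.
fetch_playlist() {
  youtube-dl -o '%(id)s.%(ext)s' "$1"
}

# e.g.: fetch_playlist 'https://www.youtube.com/playlist?list=PLxxxxxxx'
```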