This will take the packages matching a given `apt-cache search` query (a collection of AND'd words or regexps) and tell you how popular they are. This is particularly nice for those times you have to figure out which solution to use, e.g. for a PDF reader or a VNC client. Substitute "ubuntu.com" for "debian.org" if you want this to use Ubuntu's data instead; everything else will work perfectly.
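A minimal sketch of the idea, assuming Debian's popularity-contest ranking file at http://popcon.debian.org/by_inst (the search terms below are placeholders):

```
# by_inst is an assumption about where the popcon ranking lives;
# use popcon.ubuntu.com instead for Ubuntu's data
pkgs=$(apt-cache search pdf viewer | awk '{print $1}')
wget -qO- http://popcon.debian.org/by_inst | grep -wF "$pkgs"
```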
When we add a new package repository to apt (the Debian package manager), we also need to add its GPG key; otherwise apt will show a warning/error about the missing key.
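A minimal sketch; the key URL is a placeholder for whatever key the repository publishes:

```
# fetch the repo's public key and hand it to apt's keyring
wget -qO- http://example.com/repo/archive-key.asc | sudo apt-key add -
```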
In order to do that, first you need to save a cookie file with your account info. These commands do it (you may need to create the '.cookies' dir first). Also, you need to check the "Direct downloads" option on the Premium Zone >> Settings tab. You only need to do this once (as long as you keep the file and your RapidShare Premium account).
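A hedged sketch of the login step; the premiumzone.cgi endpoint and the form field names are assumptions based on RapidShare's classic interface:

```
# endpoint and field names are assumptions; substitute your credentials
mkdir -p ~/.cookies
wget -q --save-cookies ~/.cookies/rapidshare \
     --post-data "login=USERNAME&password=PASSWORD" \
     -O /dev/null https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi
```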
This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally. (Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
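A hedged sketch, assuming cPanel's classic port-2082 interface; the getbackup path and the date format in the filename are assumptions:

```
# backup URL scheme is an assumption about older cPanel installs
wget --http-user=YourUsername --http-password=YourPassword \
     "http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-$(date +%m.%d.%Y).tar.gz"
```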
Downloads the latest version of "util"; you may need to insert a sort if the files aren't listed in the right order. curl lists all files on the mirror, grep picks out your util, tail -1 takes the one listed at the bottom, and wget fetches it.
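A minimal sketch; the mirror URL and filename pattern are placeholders, and sort -V does the version ordering mentioned above:

```
# list the index page, keep the newest-looking tarball, fetch it
url=http://mirror.example.com/pub/util/
file=$(curl -s "$url" | grep -o 'util-[0-9.]*\.tar\.gz' | sort -V | tail -1)
wget "$url$file"
```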
Returns a JSON object by connecting to the 'test' endpoint of the Twitter API. The simplest way to check whether you can connect to Twitter. Output is also available in XML; use '/help/test.xml' for that.
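The call itself, against the era's REST API v1 test endpoint (long since retired, so treat the URL as historical):

```
# should return a bare "ok" wrapped in JSON if you can reach Twitter
curl -s http://api.twitter.com/1/help/test.json
```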
Use `tar xj` for bzip2 archives.
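For example, to fetch and unpack a bzip2 tarball in one pipeline (the URL is a placeholder):

```
# x = extract, j = filter through bzip2
wget -qO- http://example.com/archive.tar.bz2 | tar xj
```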
You need to have the RC (release candidate) ISO pre-downloaded before running the command.
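A hedged sketch, assuming the command uses zsync to patch the RC image up to the final release (the tool and the release URL are both assumptions):

```
# -i reuses the local RC ISO so only changed blocks are downloaded
zsync -i ubuntu-rc-desktop-i386.iso \
      http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-i386.iso.zsync
```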
Add the BackTrack repositories to your Debian based GNU/Linux distribution. Thanks to http://it-john.com/home/technology/linux-technology/add-back-track-4-repo-to-ubuntu/
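A sketch of the repo setup; the exact sources.list line is an assumption, so check the linked guide for the current one:

```
# BackTrack 4 repository line as commonly quoted in guides of the time
echo "deb http://archive.offensive-security.com pwnsauce main microverse macroverse restricted universe multiverse" \
  | sudo tee -a /etc/apt/sources.list
sudo apt-get update
```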
This one-liner greps the first 30 direct URLs for .torrent files matching your search query, ordered by number of seeds (descending; determined by the second number after your query, in this case 7; for other options just check the site via your favorite web browser). You don't have to care about grepping the torrent names as well, because they are already included in the .torrent URL (except for spaces and some other characters, which are replaced by underscores, but still human-readable). Be sure to have some http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ). I've also coded a more user-friendly ash (should be BASH-compatible) script, which also lists the total size of the download and the number of seeds/peers (available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html - may need some tweaking, as it was written for a router running OpenWrt and transmission). Happy downloading!
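A hedged sketch of the idea; TPB's URL scheme and HTML have changed many times, so the search path and the regex are assumptions:

```
# /0/7/0 = page 0, order 7 (seeds descending), any category
query="your_search_terms"
curl -s "http://thepiratebay.org/search/$query/0/7/0" \
  | grep -oE 'http://torrents\.thepiratebay\.org/[0-9]+/[^"]+\.torrent' \
  | head -30
```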
Same, but redirecting the output to a player and passing whatever text line you like. Works on my Ubuntu machine.
Usage: tpb searchterm
Example: tpb the matrix trilogy

This searches for torrents on thepiratebay and displays the top results in reverse order, so the 1st result is at the bottom instead of the top -- which is better for command-line users.
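A hypothetical sketch of such a function; the site's markup and the regex are assumptions:

```
# tac reverses the list so the best hit prints last, as described above
tpb() {
  curl -s "http://thepiratebay.org/search/$(echo "$*" | tr ' ' '+')/0/7/0" \
    | grep -oE '/torrent/[0-9]+/[^"]+' \
    | tac
}
```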
This will download all files of the type specified after "-A" from a website. Here is a breakdown of the options:

-r turns on recursion and downloads all links on the page
-l1 goes only one level of links into the page (this is really important when using -r)
-H spans domains, meaning it will download from links to sites that don't share the same domain
-nd puts all the downloads in the current directory instead of recreating the directory structure of the path
-A mp3 filters to only download links that are mp3s (this can be a comma-separated list of file formats, to search for multiple types)
-e robots=off ignores the robots.txt file, which otherwise stops programs like wget from hammering a site
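Putting the options above together (the URL is a placeholder):

```
# grab every linked mp3, one level deep, across hosts, into the cwd
wget -r -l1 -H -nd -A mp3 -e robots=off http://example.com/url
```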
Will append lines to the hosts file to do some basic ad blocking.
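A minimal sketch of the idea; the domains are placeholders, not a curated blocklist:

```
# map ad-serving hosts to localhost so they never resolve anywhere useful
printf '127.0.0.1 ads.example.com\n127.0.0.1 tracker.example.net\n' \
  | sudo tee -a /etc/hosts
```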
Website recursive offline mirror with wget.
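One common invocation (all flags are standard wget options; the URL is a placeholder):

```
# mirror the site, rewrite links for local browsing, fetch page assets
wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent http://example.com/
```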
Writes a hybrid ISO directly to a USB stick; replace /dev/sdb with the USB device in question and the ISO image link with the link of your choice.
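A minimal sketch with a placeholder image URL; triple-check the device node first, since dd will happily overwrite a disk:

```
# stream the ISO straight onto the stick, then flush buffers
wget -qO- http://example.com/image.iso | sudo dd of=/dev/sdb bs=4M
sync
```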
This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
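A hedged sketch of one way to get the same list, pulling the URLs out of wget's own log lines rather than the directory tree:

```
# --spider visits pages without saving them; each visited URL appears
# on a "--timestamp--  URL" line, which awk extracts
wget --spider -r http://example.com/ 2>&1 \
  | grep '^--' | awk '{print $3}' | sort -u > url-list.txt
```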
Seq allows printf-like formatting, specified with -f; "%03g" tells seq to print three digits, filling the blank digits with 0, and the range is from 176 to 240.
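For example, building zero-padded numbered URLs (the URL pattern is a placeholder):

```
# %03g pads each number to three digits: 176, 177, ... 240
seq -f "http://example.com/image%03g.jpg" 176 240 | xargs wget
```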
I wanted all the 'hidden' .flv files from the http link in the command line; wget seemed appropriate, fed with output from lynx, grepping the flv files and normalising via sed (to remove the numeric bullet). Similar to the 'Grab mp3 files' fu. Replace the link with your own, and the grep argument with something more interesting ;) See here for something along the same lines: http://www.commandlinefu.com/commands/view/1006/grab-mp3-files-from-your-favorite-netcasts-mp3blog-or-sites-that-often-have-good-mp3s Hope you find it useful! Improvements welcome, naturally.
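A hedged reconstruction; lynx -dump lists links as "  1. http://...", so sed strips the numeric bullet before handing URLs to wget (the page URL is a placeholder):

```
# lynx renders the page, grep keeps the .flv links, sed drops the bullets
lynx -dump http://example.com/page \
  | grep '\.flv$' \
  | sed 's/^ *[0-9]*\. //' \
  | xargs wget
```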
I have a remote php file that I want to run once an hour, so I set up cron to run this wget. I don't really care what's in the file and don't want to save the results, so I use -O to send the output to /dev/null.
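The crontab entry might look like this (the URL is a placeholder):

```
# run hourly, stay quiet, discard the response
0 * * * * wget -q -O /dev/null http://example.com/script.php
```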
Mirror the entire NASA Astronomy Picture of the Day archive, all the way from 1995. The archive is close to 2.5 GB, with lots of files, so give it some time. The logs can be redirected to a file using '-o somefile'. You might also want to try '-nH' and the '--cut-dirs' options
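A hedged sketch combining the options mentioned above; the APOD directory layout is an assumption:

```
# -m mirror, -np don't ascend past /apod/, -nH/--cut-dirs flatten the
# host/dir prefix, -o writes the logs to a file
wget -m -np -nH --cut-dirs=1 -o apod.log http://apod.nasa.gov/apod/
```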