commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not quite ready for prime time. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
Usage: clfavs username password num_favourite_commands file_in_which_to_backup
I sorta stole this from
but it didn't work, so here it is, fixed.
Updated to work with JPEGs, and to use a fancy positive look-behind assertion.
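For reference, a minimal sketch of the same idea (the page URL and the href= pattern are assumptions, not the original command):
# Pull JPEG URLs out of a page with a positive look-behind (grep -P) and download them.
wget -q -O - 'http://example.com/gallery/' | grep -oP '(?<=href=")[^"]+\.jpe?g' | xargs -r wget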
I don't know if the --spider option works to execute a script, but it might be worth trying. Note that the Drupal project uses the following in a cron job.
wget -O - -q http://localhost/drupal/cron.php
The output is sent to standard out so it can be logged by cron.
I have a remote PHP file that I want to run once an hour, so I set up cron to run this wget. I don't really care what's in the file, though, and I don't want to save the results, so I use -O to send the output to /dev/null.
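A sketch of the corresponding crontab entry (the URL is a placeholder):
# Hit the remote PHP script once an hour, discarding the output entirely.
0 * * * * wget -q -O /dev/null http://example.com/script.php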
[Note: This command needs to be run as root].
If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any time you feel the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
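The watcher itself could be as simple as this sketch (the pm-suspend call and the 60-second poll are assumptions; run it as root as noted above):
# Poll until no wget process remains, then suspend unless /tmp/nosleep exists.
while pgrep -x wget > /dev/null; do sleep 60; done; [ -f /tmp/nosleep ] || pm-suspend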
I wanted all the 'hidden' .flv files from the HTTP link on the command line; wget seemed appropriate, fed with output from lynx, grepping for the .flv files and then normalising via sed (to remove the numeric bullet). Similar to the 'Grab mp3 files' fu. Replace the link with your own, and the grep argument with something more interesting ;) See here for something along the same lines...
Hope you find it useful! Improvements welcome, naturally.
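Something along these lines, as a sketch (the URL and the grep pattern are placeholders; the sed strips lynx's numeric bullet):
# Dump the page with lynx, keep the .flv links, strip the leading ' 12. ' bullet, then download.
lynx -dump 'http://example.com/videos' | grep '\.flv$' | sed 's/^ *[0-9]*\. *//' | xargs -r wget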
Substitute the URL with your private/public XML URL from the calendar sharing settings.
Substitute the dates in YYYY-mm-dd format.
Adjust the Perl parsing part for your needs (a rough sketch follows below).
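A rough sketch of the shape of such a command (the feed URL, date parameters and the Perl one-liner are placeholders to adapt, not the exact original):
# Fetch the calendar's XML feed for a date range and print the event titles.
wget -q -O - 'http://www.google.com/calendar/feeds/PRIVATE-URL/basic?start-min=2009-01-01&start-max=2009-12-31' | perl -ne 'print "$1\n" while /<title[^>]*>([^<]+)<\/title>/g'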
This command might not be useful for most of us; I just wanted to share it to show the power of the command line.
Download the plain-text version of the novel David Copperfield from Project Gutenberg and then generate a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.
This command removes numbers and single characters from the count. I'm sure you can write a shorter version.
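Roughly, the pipeline looks like this (the Gutenberg URL is a placeholder; the awk at the end drops the numbers and single characters):
# Fetch the plain-text novel, split it into one word per line, count the occurrences,
# then filter out pure numbers and single characters.
wget -q -O - 'http://www.gutenberg.org/files/766/766.txt' | tr -cs '[:alnum:]' '\n' | sort | uniq -c | sort -rn | awk 'length($2) > 1 && $2 !~ /^[0-9]+$/'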
This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally.
(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
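As a sketch only (the cPanel port and backup path here are assumptions and vary by host):
# Ask cPanel's backup tool for a full backup archive and save it locally.
wget --http-user=YourUsername --http-password=YourPassword "http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-$(date +%m-%d-%Y).tar.gz"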
This is the content-download part.
NOTE: the '-c' option seems not to work very well and the download sometimes gets stuck at 99%; just finish wget, no problem. Also, the download may restart after completing; you can also cancel it. I don't know whether it is a wget or Rapidshare glitch, since I don't have such problems with Megaupload, for example.
UPDATE: as pointed out by roebek, the restart glitch can be solved with the "-t 1" option. Thanks a lot.
In order to do that, first you need to save a cookie file with your account info. These commands do it (you may need to create the '.cookies' dir first; see the sketch below). Also, you need to check the "Direct downloads" option on the Premium Zone >> Settings tab.
You only need to do this once (as long as you keep the file and your Rapidshare Premium account).
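A hedged sketch of both halves (the Rapidshare URLs and form-field names are from memory and may differ; USER, PASS and the file URL are placeholders):
# One-time: log in to the premium zone and save the session cookie.
wget -q --save-cookies ~/.cookies/rapidshare --post-data 'login=USER&password=PASS' -O /dev/null https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi
# Per download: reuse the cookie; -c resumes, -t 1 avoids the restart glitch.
wget -c -t 1 --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/FILEID/file.zip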
When we add a new package source to aptitude (the Debian package manager), we need to add its GPG key; otherwise it will show a warning/error about the missing key.
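For example, something like this (the key URL is a placeholder; on newer systems apt-key is deprecated in favour of keyring files):
# Fetch the repository's public key and add it to APT's keyring.
wget -q -O - http://example.com/repo/key.gpg | sudo apt-key add -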
If the username includes an @, you can use this one:
wget -r --user=username_here --password=pass_here ftp://ftp.example.com