
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta: not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Commands tagged wget
Terminal - Commands tagged wget - 89 results
clfavs(){ URL="http://www.commandlinefu.com";wget -O - --save-cookies c --post-data "username=$1&password=$2&submit=Let+me+in" $URL/users/signin;for i in `seq 0 25 $3`;do wget -O - --load-cookies c $URL/commands/favourites/plaintext/$i >>$4;done;rm -f c;}
2009-09-30 16:43:08
User: suhasgupta
Functions: c++ wget
24

Usage: clfavs username password num_favourite_commands file_in_which_to_backup
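
For readability, the same function expanded over multiple lines (behaviour unchanged; the session cookie file "c" is created in the current directory and removed at the end):

clfavs() {
  URL="http://www.commandlinefu.com"
  # log in and save the session cookie to a temporary file "c"
  wget -O - --save-cookies c \
       --post-data "username=$1&password=$2&submit=Let+me+in" \
       "$URL/users/signin"
  # favourites are paged 25 per request; walk the pages up to $3
  for i in $(seq 0 25 "$3"); do
    wget -O - --load-cookies c "$URL/commands/favourites/plaintext/$i" >> "$4"
  done
  # clean up the cookie file
  rm -f c
}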

display "$(wget -q http://dynamic.xkcd.com/comic/random/ -O - | grep -Po '(?<=")http://imgs.xkcd.com/comics/[^"]+(png|jpg)')"
2009-09-14 18:40:14
User: syssyphus
Tags: wget xkcd
6

I sort of stole this from

http://www.shell-fu.org/lister.php?id=878#MTC_form

but it didn't work, so here it is, fixed.

---

Updated to work with JPEGs, and to use a fancy positive look-behind assertion.
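
For reference, a breakdown of the grep pattern (-P is GNU grep's PCRE mode, which the look-behind requires):

# -o                print only the matched text, not the whole line
# (?<=")            positive look-behind: require a preceding double quote,
#                   without including it in the match
# [^"]+(png|jpg)    the image path, up to the closing quote
grep -Po '(?<=")http://imgs.xkcd.com/comics/[^"]+(png|jpg)'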

wget -q --spider http://server/cgi/script
2009-09-11 05:33:48
User: ashawley
Functions: wget
Tags: wget
0

I don't know if the --spider option works to execute a script, but it might be worth trying. Note that the Drupal project uses the following in a cron job.

wget -O - -q http://localhost/drupal/cron.php

The output is sent to standard out so it can be logged by cron.
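
For example, a hypothetical crontab entry that runs Drupal's cron page hourly and keeps the output in a log (the log path is an assumption):

# m h dom mon dow  command
0 * * * *  wget -O - -q http://localhost/drupal/cron.php >> /var/log/drupal-cron.log 2>&1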

wget -O /dev/null http://www.google.com
2009-09-10 14:43:57
Functions: wget
Tags: wget
3

I have a remote PHP file that I want to run once an hour, so I set up cron to run this wget. I don't actually care what's in the file and don't want to save the results, so I use -O to send them to /dev/null.
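
A sketch of the matching crontab entry (the URL is a placeholder; -q is added so cron has no progress output to mail):

0 * * * * wget -q -O /dev/null http://example.com/hourly-job.php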

while [ -n "`pgrep wget`" ]; do sleep 2; done; [ -e "/tmp/nosleep" ] || echo mem >/sys/power/state
2009-09-06 05:51:20
User: kamathln
Functions: echo sleep
1

[Note: this command needs to be run as root.]

If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any point you decide the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
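
A hypothetical session (the download URL is a placeholder):

# as a normal user: start the long download
wget -c http://example.com/big.iso
# as root, in another shell: suspend once wget exits, unless /tmp/nosleep exists
while [ -n "`pgrep wget`" ]; do sleep 2; done; [ -e "/tmp/nosleep" ] || echo mem >/sys/power/state
# from any shell, to cancel the automatic suspend later:
touch /tmp/nosleep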

wget `lynx -dump http://www.ebow.com/ebowtube.php | grep .flv$ | sed 's/[[:blank:]]\+[[:digit:]]\+\. //g'`
2009-08-02 14:09:53
User: spaceyjase
Functions: grep sed wget
3

I wanted all the 'hidden' .flv files from the HTTP link in the command line; wget seemed appropriate, fed with output from lynx: grep the .flv files and then normalise via sed (to remove the numeric bullet). Similar to the 'Grab mp3 files' fu. Replace the link with your own, and the grep argument with something more interesting ;) See here for something along the same lines...

http://www.commandlinefu.com/commands/view/1006/grab-mp3-files-from-your-favorite-netcasts-mp3blog-or-sites-that-often-have-good-mp3s

Hope you find it useful! Improvements welcome, naturally.
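
The same pipeline split across lines, swapping the command substitution for xargs (a sketch; the URL is the example above):

lynx -dump http://www.ebow.com/ebowtube.php \
  | grep '\.flv$' \
  | sed 's/[[:blank:]]\+[[:digit:]]\+\. //g' \
  | xargs wget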

wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}'
2009-07-23 14:48:54
Functions: perl wget
0

Substitute the URL with your private/public XML URL from the calendar sharing settings.

Substitute the dates (YYYY-mm-dd).

Adjust the perl parsing part for your needs.
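
The perl one-liner expanded with comments (same logic, just reformatted):

perl -lane '
  @m = $_ =~ m/<title type=.text.>(.+?)</g;  # collect the event titles
  @a = $_ =~ m/startTime=.(2009.+?)T/g;      # collect the start dates
  shift @m;                                  # drop the feed-level <title>
  for ($i = 0; $i < @m; $i++) {
    print $m[$i] . "," . $a[$i];             # one "title,date" line per event
  }
'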

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.
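
Taking up that invitation, a shorter sketch of the same idea (it skips the header-trimming sed step, so Gutenberg's boilerplate gets counted too, and prints counts first):

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt \
  | tr -cs 'A-Za-z' '\n' \
  | tr 'A-Z' 'a-z' \
  | grep -E '^[a-z]{2,}$' \
  | sort | uniq -c | sort -rn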

wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
2009-03-31 17:50:41
User: nadavkav
Functions: wget
4

This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally.

(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
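
Optionally, the credentials can be kept off the command line (and out of the process list) by putting them in ~/.wgetrc; http_user and http_password are standard wgetrc settings, after which the two --http-* options can be dropped:

# ~/.wgetrc
http_user = YourUsername
http_password = YourPassword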

wget -c -t 1 --load-cookies ~/.cookies/rapidshare <URL>
2009-03-28 09:13:35
User: cammarin
Functions: wget
6

The download-content part.

NOTE: '-c' seems not to work very well, and the download sometimes gets stuck at 99%; wget then just finishes with no error. Also, the download may restart after completing; you can cancel it when that happens. I don't know if it is a wget or a Rapidshare glitch, since I don't have problems with Megaupload, for example.

UPDATE: as pointed out by roebek, the restart glitch can be solved by the "-t 1" option. Thanks a lot.

wget --save-cookies ~/.cookies/rapidshare --post-data "login=USERNAME&password=PASSWORD" -O - https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi > /dev/null
2009-03-28 09:12:02
User: cammarin
Functions: wget
4

In order to do that, you first need to save a cookie file with your account info. These commands do it (you may need to create the '.cookies' directory first). You also need to check the "Direct downloads" option on the Premium Zone >> Settings tab.

You only need to do this once (as long as you keep the file and your Rapidshare Premium account).
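
Putting this and the download entry above together, the whole flow looks like this (<URL> is a placeholder, as above):

mkdir -p ~/.cookies
wget --save-cookies ~/.cookies/rapidshare \
     --post-data "login=USERNAME&password=PASSWORD" \
     -O - https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi > /dev/null
wget -c -t 1 --load-cookies ~/.cookies/rapidshare <URL>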

wget -qO- whatismyip.org
wget -q http://xyz.gpg -O- | sudo apt-key add -
2009-03-25 12:18:36
User: gnuyoga
Functions: sudo wget
Tags: wget apt-key
4

When we add a new package source to aptitude (the Debian package manager), we also need to add its GPG key; otherwise it will show a warning or error about the missing key.
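
A hypothetical end-to-end example (the repository URL and list file name are placeholders):

# fetch and register the repository's signing key
wget -q http://repo.example.com/archive.gpg -O- | sudo apt-key add -
# add the repository itself, then refresh the package index
echo 'deb http://repo.example.com/debian stable main' | sudo tee /etc/apt/sources.list.d/example.list
sudo apt-get update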

wget -r ftp://username_here:pass_here@ftp.example.com
2009-03-09 19:39:30
User: movaxes
Functions: wget
Tags: wget
6

If the username includes an @ you can use this one:

wget -r --user=username_here --password=pass_here ftp://ftp.example.com
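
Alternatively, an @ inside the username can be percent-encoded as %40 directly in the URL; wget decodes it:

wget -r ftp://user%40example.com:pass_here@ftp.example.com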