Resolving a basic authentication problem (401) with wget

wget --auth-no-challenge --server-response -O- $url 2>&1 | grep "Cookie" | sed "s/^ Set-//g" > cookie.txt; wget --auth-no-challenge --server-response --http-user="user" --http-password="pw" --header="$(cat cookie.txt)" -O- $url
I have a server with a PHP script requiring basic authentication, like this:

header('WWW-Authenticate: Basic realm="do auth"'); header('HTTP/1.0 401 Unauthorized'); ...?>

Basic authentication in wget did not work for me:

wget --auth-no-challenge --http-user="username" --http-password="password" -O- "http://url"
wget --keep-session-cookies --save-cookies=cookies.txt --load-cookies=cookies.txt --http-user="username" --http-password="password" -O- "http://url"

I always received "401 Authorization failed", and the saved cookie file was always empty. With my approach, I receive the headers from the server and save the session cookie, then resend that session cookie together with the authentication data.
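The --auth-no-challenge flag makes wget send the Authorization header preemptively, without waiting for the server's 401 challenge. That header value is nothing more than Base64 of user:password, as this small sketch shows (the "user"/"pw" credentials are placeholders, not values from the command above):

```shell
# Build the Basic auth header that wget sends with --auth-no-challenge.
# "user:pw" stands in for your real credentials.
token=$(printf '%s' "user:pw" | base64)
echo "Authorization: Basic $token"   # Authorization: Basic dXNlcjpwdw==
```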
Sample Output
--2010-12-01 09:18:49--  http://your.url/
Connecting to ...:80... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200 OK
  Content-Type: text/html
Length: 1715 (1.7K) [text/html]
Saving to: `STDOUT'

2010-12-01 11:24:35
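The cookie-capture step in the command above hinges on turning the server's Set-Cookie response header into a Cookie request header before resending it. A minimal illustration of that sed rewrite, using a hypothetical PHPSESSID value:

```shell
# A header line as wget --server-response prints it (leading spaces),
# with a made-up session cookie:
line='  Set-Cookie: PHPSESSID=abc123; path=/'
# Strip the leading whitespace and the "Set-" prefix so the line can be
# passed back verbatim via --header="$(cat cookie.txt)":
printf '%s\n' "$line" | sed 's/^ *Set-//'
# -> Cookie: PHPSESSID=abc123; path=/
```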

These Might Interest You

  • This one uses the history modifier :q to automatically quote the previous command. It handles text that is already in quotes, and by using single quotes it prevents variables from being resolved on execution. The sample output omits the redirection to a file in order to show the essence of the problem.

    echo !!:q >
    kudlaty01 · 2017-10-06 08:49:27 0
  • Let me suggest using wget to obtain the HTTP header only as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for:

    Spider mode enabled. Check if remote file exists.
    --2009-03-31 20:42:46-- Resolving Connecting to||:80... connected.
    HTTP request sent, awaiting response...

    and the second one stands for:

    Length: 438 [text/html]
    Remote file exists and could contain further links, but recursion is disabled -- not retrieving.

    wget --server-response --spider
    penpen · 2009-03-31 18:49:14 6
  • The download-content part. NOTE: the '-c' option seems not to work very well; the download sometimes gets stuck at 99%, but simply re-running wget finishes it with no problem. Also, the download may restart after completing; you can cancel it at that point. I don't know whether this is a wget or a Rapidshare glitch, since I don't have such problems with Megaupload, for example. UPDATE: as pointed out by roebek, the restart glitch can be solved with the "-t 1" option. Thanks a lot.

    wget -c -t 1 --load-cookies ~/.cookies/rapidshare <URL>
    cammarin · 2009-03-28 09:13:35 2
  • This is a wonderful Perl script to check web server security and vulnerabilities. Get it from here: Here are some key features of Nikto:

    - Uses rfp's LibWhisker as a base for all network functionality
    - Main scan database in CSV format for easy updates
    - Determines "OK" vs "NOT FOUND" responses for each server, if possible
    - Determines CGI directories for each server, if possible
    - Switches HTTP versions as needed so that the server understands requests properly
    - SSL support (Unix with OpenSSL or maybe Windows with ActiveState's Perl/NetSSL)
    - Output to file in plain text, HTML or CSV
    - Generic and "server type" specific checks
    - Plugin support (standard Perl)
    - Checks for outdated server software
    - Proxy support (with authentication)
    - Host authentication (Basic)
    - Watches for "bogus" OK responses
    - Attempts to perform educated guesses for authentication realms
    - Captures/prints any cookies received
    - Mutate mode to "go fishing" on web servers for odd items
    - Builds Mutate checks based on robots.txt entries (if present)
    - Scans multiple ports on a target to find web servers (can integrate nmap for speed, if available)
    - Multiple IDS evasion techniques
    - Users can add a custom scan database
    - Supports automatic code/check updates (with web access)
    - Multiple host/port scanning (scan list files)
    - Username guessing plugin via the cgiwrap program and Apache ~user methods

    0 -h yourwebserver
    unixbhaskar · 2009-08-29 04:54:43 0
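A note on the Rapidshare command above: for its --load-cookies option to work, the cookie file has to be in the Netscape cookies.txt format, which is also what wget's --save-cookies writes. A hypothetical entry (tab-separated fields: domain, include-subdomains flag, path, secure flag, expiry as a Unix timestamp, name, value) looks like this:

```text
# Netscape HTTP Cookie File
.rapidshare.com	TRUE	/	FALSE	1999999999	enc	0123456789abcdef
```

The cookie name and value here are made up for illustration; the real ones come from your logged-in browser session or a previous wget --save-cookies run.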

