Commands tagged cntlm (1)

  • If you have to deal with MS SharePoint, which is (rarely, let's hope) used in certain corporate environments, this uses Cntlm. For single files, just use cURL -- its NTLM authentication works quite well.
    # /etc/cntlm.conf:
    # Username  account
    # Domain    domain
    # Password  ############
    # Proxy     10.20.30.40   (IP of the sharepoint site)
    # NoProxy   *
    # Listen    3128


    http_proxy=http://127.0.0.1:3128 wget --http-user='domain\account' --http-password='###' -p -r -l 8 --no-remove-listing -P . 'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fteams%2fTeam%2fShared%20Documents%2fFolder'
    mhs · 2012-12-26 09:03:55
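The single-file cURL route mentioned above can look like the following sketch; the host, path, and credentials are placeholders taken from the recursive example, not a real site:

```shell
# Hypothetical single-file fetch with curl's built-in NTLM support
# (no cntlm proxy needed). Replace host, path, and credentials.
curl --ntlm -u 'domain\account' \
  -o AllItems.aspx \
  'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx'
```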

Check These Out

Examine processes generating traffic on your website
I often have to google this so I put it here for quick reference.

Find file containing namespace in a directory of jar files.
You could substitute javax.servlet for any namespace you need.

Printable random characters
Reads pseudorandom bytes from /dev/urandom, filtering out non-printable ones. Other character classes can be used, such as [:alpha:], [:digit:] and [:alnum:]. To get a string of 10 lowercase letters: $ tr -dc '[:lower:]' < /dev/urandom | head -c 10
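For instance, a quick random-string generator along the same lines (the length and character class here are arbitrary choices):

```shell
# 16 random characters drawn from the printable, non-space set.
tr -dc '[:graph:]' < /dev/urandom | head -c 16; echo
```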

Calculates the date 2 weeks before the given Saturday, in the specified format.
Good for automating reports that need to run from between two dates.
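With GNU date this kind of relative arithmetic is a one-liner (BSD date uses -v instead of -d); the format strings are just examples:

```shell
# Two weeks ago, in an explicit format (GNU date).
date -d "2 weeks ago" +%Y-%m-%d
# Anchor on a weekday instead of today:
date -d "last saturday -14 days" +%Y-%m-%d
```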

List all directories only.
Undocumented syntax, but should work on every shell. It'll list all directories in the current one. Change `*/` into globbing `**/` for recursivity.
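A minimal demonstration of the glob:

```shell
# A trailing slash in the glob matches directories only.
echo */
# One per line, without descending into them:
ls -d -- */
```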

Count number of Line for all the files in a directory recursively
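The entry's command isn't shown here; one hedged way to get the total is:

```shell
# Total line count across every regular file, recursively.
# -print0/-0 keep odd filenames (spaces, newlines) safe.
find . -type f -print0 | xargs -0 cat | wc -l
```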

Every Nth line position # (SED)
sed extracts every nth line. The generic form is: $ sed -n 'START,${p;n;...}' foo, where the number of n commands after p sets the stride: p;n;n prints every 3rd line and p;n;n;n;n every 5th.
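Concretely, on a nine-line input with start position 1 and a stride of 3:

```shell
# p prints the current line; each n discards one following line,
# so p;n;n yields every 3rd line starting at line 1 (prints 1, 4, 7).
seq 1 9 | sed -n '1,${p;n;n}'
# GNU sed's first~step addressing does the same:
seq 1 9 | sed -n '1~3p'
```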

Efficiently print a line deep in a huge log file
Sed can quit at the match (via the q command) and so is much more efficient than piping head into tail or similar. Grab a line range using $ sed '999995,1000005!d' < my_massive_file
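For a single line, the quit-at-match form looks like this (the line number and file name are just examples):

```shell
# Print line 1000000 and quit immediately; sed never reads the rest
# of the file.
sed -n '1000000{p;q}' my_massive_file
```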

Copy via tar pipe while preserving file permissions (run as root to preserve ownership!)
Note that cp can do this too, with options: -p preserves the file mode, ownership, and timestamps; -r copies recursively. If you also want to keep symlinks in addition to the above, use the -a/--archive option.
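The tar pipe itself isn't shown in the entry; a common shape for it, with placeholder paths, is:

```shell
# Pack the source tree to stdout, unpack it at the destination.
# x with p preserves permissions; run as root to keep ownership too.
tar cf - -C /source/dir . | tar xpf - -C /dest/dir
```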

Get a funny one-liner from www.onelinerz.net
Put this command in .bashrc and every time you open a new terminal a random quote will be downloaded and printed from onelinerz.net. By altering the URL in the w3m statement you can change the output: 1 to 10 lines - http://www.onelinerz.net/random-one-liners/(number)/ 20 newest lines - http://www.onelinerz.net/latest-one-liners/ Top 10 lines - http://www.onelinerz.net/top-100-funny-one-liners/ Top 10 lines are updated daily.

