The example will create a directory called "…". Caveat: @imports of CSS files will not be converted.
The good: Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.4 with Suhosin-Patch
The bad: Server: Microsoft-IIS/6.0
The ugly: Server: Apache/2.2.10 (Win32) mod_ssl/2.2.10 OpenSSL/0.9.8i PHP/5.2.6
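One way to see these Server headers is a plain HEAD request; the original command isn't shown, so the flags and URL below are assumptions:
curl -sI http://www.example.com/ | grep -i '^Server:'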
Mask the user agent as Firefox, recursively download 2 levels deep while spanning hosts with a maximum of 1 redirection, use a random wait time, and dump all PDF files into myBooksFolder without creating any other directories. The host will have no way of knowing that this is a grabber script.
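A sketch of a wget invocation matching that description; the user-agent string and starting URL are illustrative assumptions, not taken from the original:
wget --user-agent='Mozilla/5.0 (X11; Linux x86_64; rv:12.0) Gecko/20100101 Firefox/12.0' -r -l 2 -H --max-redirect=1 --random-wait -A pdf -nd -P myBooksFolder 'http://example.com/books/'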
Allow multi-word translations.
See man wget if you also want linked files, not only those hosted on the website itself.
If the username includes an @ you can use this one: wget -r --user=username_here --password=pass_here ftp://ftp.example.com
Let me suggest using wget for obtaining the HTTP header only as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for:
Spider mode enabled. Check if remote file exists.
--2009-03-31 20:42:46-- http://www.example.com/
Resolving www.example.com... 208.77.188.166
Connecting to www.example.com|208.77.188.166|:80... connected.
HTTP request sent, awaiting response...
and the second one stands for:
Length: 438 [text/html]
Remote file exists and could contain further links, but recursion is disabled -- not retrieving.
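The invocation implied by that output is something along these lines; the exact flags of the original command aren't shown here:
wget --spider http://www.example.com/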
The above URL contains over 6700 of the common ad websites. The command just pastes these into your /etc/hosts.
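The general pattern, run as root, looks like the following; the blocklist URL here is a placeholder, not the one the description refers to:
wget -q -O - 'http://example.com/ad-hosts.txt' >> /etc/hosts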
Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use. The simplest application is to limit the bandwidth usage of programs.
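For example, the following sketch caps wget at roughly 100 KB/s down and 30 KB/s up; the limits and URL are illustrative:
trickle -d 100 -u 30 wget 'http://example.com/large.iso'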
Seeing that we get back plain text anyway, we don't need lynx. Also, the sed part removes the credit line.
A slightly faster alternative.
The download-content part. NOTE: the '-c' option seems to not work very well, and the download sometimes gets stuck at 99%; re-running wget usually finishes it without problems. Also, the download may restart after completing; you can simply cancel it. I don't know if this is a wget or a Rapidshare glitch, since I don't have these problems with Megaupload, for example. UPDATE: as pointed out by roebek, the restart glitch can be solved by the "-t 1" option. Thanks a lot.
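How those two flags combine, as a rough sketch; the URL is a made-up placeholder:
wget -c -t 1 'http://rapidshare.com/files/123456789/file.zip'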
I used to use the Firefox "View page info" feature a lot to determine how stale the web page I was looking at was. Now that I use mostly Chrome I miss that feature, so here is a command line alternative using wget. The -S says to display the server response, the --spider says to not download any files/pages, just fetch the header. The output goes to stderr, so to grep it you use 2>&1 to combine the stderr stream with stdout, then pipe that to grep for Last-Modified.
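Put together, the wget version looks something like this (using the same URL as the curl example below):
wget -S --spider http://osswin.sourceforge.net 2>&1 | grep -i Last-Modified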
You can use curl instead if you have it installed, like this:
curl --head -s http://osswin.sourceforge.net | grep Mod
You may want to change &hl=en to &hl=es (or whatever language you want); you may want imgsz=xxlarge instead of imgsz=large (or whatever size filter you want); and q=apples is the query, or whatever you want to search for.
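How those parameters combine into a query string, as a sketch; the base URL is an assumption, since the original command isn't shown:
wget -q -O - 'http://images.google.com/images?q=apples&imgsz=xxlarge&hl=en'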
Intended for dynamic-IP OpenDNS users, this command will update your OpenDNS network IP. For getting your IP, you can use one of the many one-liners here on commandlinefu. Example: I use this in a script which is run by kppp after it has successfully connected to my ISP:
---
#!/bin/bash
IP="`curl -s http://checkip.dyndns.org/ | grep -o '[[:digit:].]\+'`"
PW="hex-obfuscated-pw-here"
if [ "$IP" == "" ] ; then
  echo 'Not online.' ; exit 1
else
  wget -q --user=topsecret --password="`echo $PW | xxd -ps -r`" 'https://updates.opendns.com/nic/update?hostname=myhostname&myip='"$IP" -O -
  /etc/init.d/ntp-client restart &
fi
---
PS: DynDNS should use a similar method; if you know the URL, please post a comment. (Something with members.dyndns.org, if I recall correctly.)
I don't have curl or links installed, so I use wget, writing the file to standard output.
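The relevant wget idiom, shown with a placeholder URL:
wget -q -O - 'http://example.com/page'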
Copy the link to an HD movie trailer into this command. It's more elegant if it's put into a script, taking the URL as input.
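A small wrapper sketch along those lines; the QuickTime-style user-agent string is an assumption about what the trailer server expects and is not taken from the original command:
#!/bin/bash
# download the trailer URL passed as the first argument
wget -U "QuickTime/7.6.2" "$1"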
The original was a little bit too complicated for me. This one does not use any variables.
Uses the Google API to translate; you can change the output language by modifying the parameter "langpair=|en". The format is "input language|output language".
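A sketch using the (since retired) Google AJAX Language API; the endpoint and exact parameters are assumptions based on the description:
curl -s 'http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=hello%20world&langpair=%7Cen'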
substitute "example" with desired string; tl = target language (en, fr, de, hu, ...); you can leave sl parameter as-is (autodetection works fine) Show Sample Output
- Where $URL is the URL of the file.
- Replace the $2 at the end with $3 to get a human-readable size.
Credits to svanberg @ ArchLinux forums for the original idea. Edit: Replaced the command with a better version by FRUiT (removed an unnecessary grep).
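A sketch consistent with that description, since the exact command isn't shown here: grab the Length line from wget's spider output and print the second field, which is the size in bytes.
wget --spider "$URL" 2>&1 | awk '/Length/ {print $2}'
With $3 instead of $2, awk prints the parenthesised human-readable size when wget includes one.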
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: