The -p parameter tells wget to download all page requisites, including images. -e robots=off tells wget not to obey the robots.txt file. -U mozilla sets Mozilla as your browser identity. --random-wait lets wget choose a random number of seconds to wait between requests, so you avoid getting blacklisted. Other useful wget parameters: --limit-rate=20k limits the download rate, -b runs wget in the background so it keeps going after you log out, and -o $HOME/wget_log.txt logs the output to a file.
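Put together, a full invocation along these lines might look like the following; the -r recursion flag and the example URL are assumptions, since the original command isn't reproduced here:
wget -r -p -e robots=off -U mozilla --random-wait --limit-rate=20k -o $HOME/wget_log.txt http://example.com/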
Usage:
translate <phrase> <source-language> <output-language>
Example:
translate hello en es
See this for a list of language codes:
http://en.wikipedia.org/wiki/List_of_ISO_639-1_codes
EDIT: command updated to support accented characters! Works in any of the 58 Google-supported languages (some sound like crap; English is the best, IMO). You get an mp3 file containing your query in spoken language. There is a limit of 100 characters for the "q" parameter, so be careful. The "tl" parameter sets the target language.
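As a rough sketch, the underlying request presumably looks something like this; only the q and tl parameters are described above, so the translate_tts endpoint path and the rest of the URL are assumptions:
wget -q -U Mozilla -O hello.mp3 "http://translate.google.com/translate_tts?tl=en&q=hello"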
This recursively downloads all images from a given website to your /tmp directory. The -nH and -nd switches stop wget from recreating the remote host and directory structure locally.
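A minimal sketch of such a command, assuming the usual recursive flags and a typical set of image extensions (the -A list and the example URL are illustrative):
wget -r -l1 -nH -nd -P /tmp -A jpg,jpeg,png,gif http://example.com/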
Nothing special required, just wget, sed & tr!
Blacklisted is a compiled list of all known dirty hosts (botnets, spammers, bruteforcers, etc.), updated on an hourly basis. This command fetches the list and creates the firewall rules for you; if you want the hosts blocked automatically, append |sh to the end of the command line. Blocking everything and allowing in specific hosts is the more practical solution, but many people don't or can't do that, which is where this script comes in handy. For those using ipfw, a quick fix is to change the awk action to {print "add deny ip from "$1" to any"}. The top two entries are posted in the sample output. Be advised that the blacklisted file itself filters out RFC 1918 addresses (10.x.x.x, 172.16-31.x.x, 192.168.x.x); it is still advisable to check/parse the list before you implement the rules.
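A sketch of what such a pipeline could look like for iptables; the blacklist URL is a placeholder and the awk filter is only illustrative, so adapt it to the actual list format:
wget -qO - http://example.com/blacklisted | awk '!/^#/ && NF {print "iptables -A INPUT -s "$1" -j DROP"}'
Append |sh to apply the rules immediately, or swap the print string for the ipfw form shown above.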
Usage: clfavs username password num_favourite_commands file_in_which_to_backup
This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.
To get a random xkcd comic, I also use the following:
xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
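A sketch of the latest-comic function described above, following the same pattern as xkcdrandom (the scraping regexes are assumed to match xkcd's front-page markup in the same way):
xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}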
Uses htmldoc to perform the conversion
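Presumably this means rendering a fetched HTML page into a PDF; a minimal sketch under that assumption, with placeholder URL and filenames:
wget -q http://example.com/page.html -O page.html && htmldoc --webpage -f page.pdf page.html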
Very nice command when you want to download a site locally to your machine, including images, CSS and JavaScript.
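The exact command isn't reproduced here, but a typical wget invocation for this kind of local copy is something like the following (flags chosen as an illustration: -p grabs page requisites, -k converts links for local viewing, -E adds .html extensions):
wget -r -l1 -p -k -E http://example.com/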
If the site uses https, use:
wget --reject html,htm --accept pdf,zip -rl1 --no-check-certificate https-url
This will uncompress the file while it's being downloaded, which makes it much faster.
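As a sketch, with a placeholder URL and assuming a gzipped tarball:
wget -qO- http://example.com/archive.tar.gz | tar xz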
Usage: t2s 'How are you?'
Nice because it automatically names the mp3 file from the first 15 characters of the query.
Modified (uses bash string manipulation instead of tr): t2s() { wget -q -U Mozilla -O $(cut -b 1-15
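A hedged sketch of the full function, built from the q and tl parameters described earlier; the translate_tts endpoint path and the exact filename handling are assumptions:
t2s() { wget -q -U Mozilla -O "${1:0:15}.mp3" "http://translate.google.com/translate_tts?tl=en&q=${1// /+}"; }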
Self-referential use of wget.
This one uses dictionary.com
Though without infinite time and knowledge of how the site will be designed in the future this may stop working, it still serves as a simple, straightforward starting point.
This uses the observation that the only item marked as <strong> on the page is the single logical line that includes the italicized fact.
If future revisions of the page cause failure, or intermittent failure, one may simply alter the above to read:
wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
The file lastfact can then be examined whenever the command fails.
Check if a site is down with downforeveryoneorjustme.com
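One way to script that check with wget might look like this; the URL format and the page text being grepped for are assumptions about how downforeveryoneorjustme.com responds:
wget -qO- "http://www.downforeveryoneorjustme.com/example.com" | grep -o "It's just you\|looks down"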
Curl is not installed by default on many common distros anymore. wget always is :)
wget -qO- ifconfig.me/ip
xargs can be used in this manner to download multiple files at a time; in this case xargs runs 10 processes at once and starts a new one whenever the number running falls below 10.
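A minimal sketch, assuming the URLs are listed one per line in a file called urls.txt:
xargs -n 1 -P 10 wget -q < urls.txt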
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: