Check These Out
* To get the English dictionary: wget http://www.mavi1.org/web_security/wordlists/webster-dictionary.txt
No need to parse an HTML page; the website gives us a txt file :)
My Firefox overheats my CPU, sometimes above 90 degrees Celsius (hence the name?).
To keep an eye on the temperature, I put this command inside KAlarm (a kind of cron) to be repeated every minute, displayed for 5 seconds, in red (the default colour for osd_cat).
It's pretty and ultra small: it displays a tiny two-line text on every desktop, over everything, and does not steal focus or interrupt any task. I get the information passively, in the low-profile bottom of the screen.
Of course you can use it inside a terminal. Just do it:
watch -n 60 'acpi -t | osd_cat -p bottom'
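For the KAlarm variant described above, a minimal sketch of the command to repeat every minute (an assumption on my part; the 5-second display uses osd_cat's -d delay option):
acpi -t | osd_cat -p bottom -d 5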
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
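The command this comment refers to isn't reproduced in this section; as one plausible illustration of the idea (an assumption, using lsof), both a numeric port and a service name from /etc/services work:
lsof -i :80
lsof -i :http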
This pipeline will find, sort, and display all files based on mtime. The same thing could be done with find | xargs, but that pipeline will not produce correct results if the output of find is larger than the xargs command-line buffer: when the buffer fills, xargs processes the find results in more than one batch, which is not compatible with sorting (a reconstruction of the pipeline is sketched below).
Note the "-print0" switch on find and the "-0" switch for perl; together they pass arbitrary filenames safely, which is what you would otherwise use xargs -0 for. Don't you love perl?
Note that this pipeline can easily be modified to sort on any field produced by perl's stat operator: size, hard links, change time, etc. Look at the documentation for stat and just change the '9' to the index you want. Changing the '9' to a '7', for example, sorts by file size; a '3' sorts by number of hard links...
Use head or tail at the end of the pipeline to get the oldest or the most recent files. Use awk or perl -wnla for further processing; since there is a tab between the two fields, it is very easy to parse.
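The original pipeline isn't reproduced in this section, so the following is only a minimal reconstruction under the assumptions above (null-delimited find output, perl's stat index 9 for mtime, a tab-separated result sorted numerically):
find . -type f -print0 | perl -0ne 'chomp; print +(stat)[9], "\t$_\n"' | sort -n
Append | tail for the most recently modified files or | head for the oldest; change the 9 to a 7 to sort by size instead.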
The `export` is unnecessary if it's only applicable to the one command.
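As a general illustration of that point (the variable and command below are placeholders, not the original tip):
LC_ALL=C sort words.txt    # one-shot assignment: affects only this sort invocation
export LC_ALL=C            # export: persists for every later command in this shell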
This will cause any commands you have executed in the current shell session to not be written to your .bash_history file upon logout.
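The tip these comments discuss isn't shown in this section; one way to get that behaviour (an assumption, not necessarily the original command) is to point the history file at /dev/null for the current session:
HISTFILE=/dev/null
unset HISTFILE has a similar effect, and since the shell itself reads the variable, no export is needed.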