Check These Out
Useful for finding the list of dependencies.
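The command itself isn't shown here; a minimal sketch, assuming this refers to listing a binary's shared-library dependencies with ldd (the binary path is just an example):

    ldd /bin/ls    # prints each shared library the binary links against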
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
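A sketch of the kind of invocation this describes, assuming lsof is the tool in question:

    lsof -i :80      # processes listening on or connected to port 80
    lsof -i :http    # same thing, using the service name from /etc/services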
Finds all *.p[ml] files and runs perl -c on each, checking whether Perl thinks they are syntactically correct.
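A sketch matching that description, assuming the current directory as the search root:

    find . -name '*.p[ml]' -exec perl -c {} \;    # syntax-check every .pm and .pl file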
Gets the latest Tweets in your friends' timeline from Twitter. Uses curl and xmlstarlet.
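The Twitter API this relied on (the old v1 XML endpoint with basic auth) has long since been retired, so the following is only a historical sketch of the curl + xmlstarlet pattern; USER:PASS is a placeholder:

    # fetch the friends timeline as XML, then print "screen_name: tweet text" per status
    curl -s -u USER:PASS http://twitter.com/statuses/friends_timeline.xml \
      | xmlstarlet sel -t -m '//status' -v 'user/screen_name' -o ': ' -v 'text' -n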
Barely worth posting because it is so simple, but I use it literally all the time. I was always frustrated by the limitations that a non-GUI environment imposes on diff'ing files. This fixes some of those limitations by colourising the output (you'll have to install colordiff, but it is just a wrapper for diff itself), using side-by-side mode for clearer presentation, and of course the -W parameter, using tput to automatically insert your terminal width. Note that the double quotes aren't necessary if typed into the terminal as-is; I included them for safety's sake.
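Reconstructing the command from the description (colordiff accepts diff's options, since it wraps diff; file1 and file2 are placeholders):

    colordiff -y -W "$(tput cols)" file1 file2    # side-by-side, full terminal width, colourised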
Axel
--max-speed=x, -s x
You can specify a speed (bytes per second) here and Axel will
try to keep the average speed around this speed. Useful if you
don't want the program to suck up all of your bandwidth.
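For example, capping Axel at roughly 512 KB/s (the value is in bytes per second; the URL is a placeholder):

    axel -s 524288 http://example.com/big.iso    # keep average speed near 512 KB/s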
Thanks to flatcap for optimizing this command.
This command takes advantage of the ext4 filesystem's resistance to fragmentation.
By using this command, files that were previously fragmented will be copied, deleted, and re-created, essentially giving the filesystem another chance at saving each file contiguously (unlike FAT/NTFS, *nix filesystems always try to save a file without fragmenting it).
My command only affects the home directory, and only those files you have R/W (read/write) permissions on.
There are two issues with this command:
1. it really won't help much; it works, but Linux doesn't suffer much (if any) fragmentation, and even fragmented files have fast I/O
2. it doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented ~/ directory
The benefits I managed to work into the command:
1. it only defragments files under 16 MB, because fragmentation in a large file isn't as noticeable as in a small one, and copying/deleting/re-creating large files would take too long (see the sketch after this list)
2. it gives a nice countdown in the terminal so you know how much progress is being made, and just like other defragmenters you can stop at any time (use Ctrl+C)
3. it's fast! I can defrag my ~/ directory in 11 seconds thanks to the ramdrive powering the command's temporary storage
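A minimal sketch of the copy-out/copy-back idea, assuming /dev/shm as the ramdrive; this is not the exact original command (the progress countdown is omitted), just the technique it describes:

    # for every writable file under ~/ smaller than 16 MB: copy it to the
    # ramdisk, remove the original, copy it back, letting ext4 pick a
    # fresh (hopefully contiguous) location for the rewritten file
    find ~/ -type f -size -16M -writable -print0 |
    while IFS= read -r -d '' f; do
        cp -p -- "$f" /dev/shm/defrag.tmp &&
        rm -- "$f" &&
        cp -p -- /dev/shm/defrag.tmp "$f"
    done
    rm -f /dev/shm/defrag.tmp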
bottom line:
1. it's only an experiment: safe (I've used it several times for testing), but probably not very effective (unless you somehow have a fragmentation problem on Linux). It might be a placebo for recent Windows converts who are looking for a defrag utility on Linux and won't accept no for an answer
2. it's my first commandlinefu command