Probably only works with GNU du and modern perls.
Parses `lsmod' output and passes it to the `dot' drawing utility, then finally pipes the result to an image viewer.
OK, not the most useful but a good way to impress friends. Requires the "display" command from ImageMagick.
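The original one-liner isn't reproduced here, but a minimal sketch of the same idea (assuming Linux lsmod output, Graphviz's dot and ImageMagick's display are available) could look like this:
lsmod | perl -ane 'BEGIN { print "digraph modules {\n" } next if $. == 1; print qq{"$_" -> "$F[0]";\n} for split /,/, ($F[3] || ""); END { print "}\n" }' | dot -Tpng > modules.png && display modules.png
Each edge points from a module to a module it uses (the lsmod "Used by" column read in reverse).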
This one-liner will *delete*, without any further confirmation, all but one of any set of 100% duplicates (as determined by their md5 hash) in the current directory tree (i.e. including files in its subdirectories).
Good for cleaning up collections of mp3 files, or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive.
md5sum can be substituted with sha1sum without problems.
The actual filename is not taken into account; only the hash is used.
Whatever sort thinks is the first filename is kept.
It is assumed that the filename does not contain 0x00.
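The delete one-liner itself isn't quoted above; based on the hard-link variant given below, it presumably has roughly this shape (a sketch, not necessarily the exact original):
find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); unlink($f) if $h eq $ph;'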
As per the good suggestion in the first comment, this one does a hard link instead:
find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:
define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}
If your version of grep doesn't have perl compatible regex support, then you can use this version:
define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
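Usage is just the function name followed by a word or phrase (the example below is arbitrary, and the output depends on what Google returns at the time):
define free software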
Tail with coloured output, with the help of perl. Need more colours? Here is a colour table: http://www.tuxify.de/?p=23
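The command itself isn't shown here; one common way to get the effect (a sketch, with /var/log/syslog, ERROR and WARNING as arbitrary placeholders, and ANSI escape codes for red and yellow) is:
tail -f /var/log/syslog | perl -pe 's/\bERROR\b/\e[1;31m$&\e[0m/g; s/\bWARNING\b/\e[1;33m$&\e[0m/g'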
Can be run as a script `ftrace` if my_command is substituted with "$@". It is useful when a command fails and you have the feeling it is accessing a file you are not aware of.
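The exact command isn't reproduced here; a quick-and-dirty sketch of the idea, tracing file-related syscalls with strace and pulling out the quoted file names with perl, might look like:
ftrace() { strace -f -e trace=file "$@" 2>&1 | perl -ne 'print "$1\n" if /"((?:[^"\\]|\\.)*)"/' | sort -u; }
Since stderr is merged into the pipe, some of the traced command's own output can slip into the list; for a rough overview of which files get touched it is usually good enough.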
Prints a list of all branches with the date of the last commit to each branch, including the relative time since that commit, with colour coding.
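The original command isn't shown; one way to get that kind of listing (a sketch using git for-each-ref, which is not necessarily what the original does) is:
git for-each-ref --sort=-committerdate refs/heads/ --format='%(color:green)%(committerdate:relative)%(color:reset) %(color:yellow)%(refname:short)%(color:reset) %(committerdate:iso8601)'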
Written by jmcnamara. Taken from http://www.perlmonks.org/?node_id=274896
Attribution: thanks to repellent on perlmonks.org. Source: http://www.perlmonks.org/?node_id=684459
Works on Linux and Solaris. I think it will work on nearly all *nixes.
This is a command that I find myself using all the time. It works like regular grep, but returns the paragraph containing the search pattern instead of just the line. It operates on files or standard input.
grepp <PATTERN> <FILE>
or
<SOMECOMMAND> | grepp <PATTERN>
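One possible implementation (a sketch relying on Perl's paragraph mode, -00; the quoting is deliberately simple and not injection-safe, and this is not necessarily the original function):
grepp() { local pat="$1"; shift; perl -00ne "print if /$pat/" "$@"; }
With no file arguments, perl reads standard input, which is what makes the pipe form above work.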
Fetches the IPs and ONLY the IPs from ifconfig. Simplest, shortest, cleanest. Perl is too good to be true... (P.S.: credit should go to Peteris Krumins at catonmat.net)
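For reference, a sketch in that spirit (the exact regex depends on your ifconfig's output format; older versions print "inet addr:", newer ones just "inet"):
ifconfig | perl -nle 'print $1 if /inet (?:addr:)?(\d+(?:\.\d+){3})/'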
Silly Perl variant.
The addition of ".bk" to the regular "pie" idiom makes perl create a backup (with the extension ".bk") of every file it edits, in case it b0rks something and you want it back.
Change the *.avi to whatever you want to match; you can remove it altogether if you want to check all files.
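As an illustration of the idiom (the s/OLD/NEW/g here is just a placeholder, not the original edit):
perl -pi.bk -e 's/OLD/NEW/g' *.avi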
My Programming Languages professor assigned my class a homework assignment where we had to write a Perl interpreter using Perl. I really like Python's interactive command line interpreter which inspired this Perl script.
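The script itself isn't included here; a minimal sketch of such a read-eval-print loop (not the author's version) is:
perl -e '$| = 1; while (1) { print "perl> "; last unless defined($_ = <STDIN>); my $r = eval; print $@ ? $@ : "$r\n" }'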
Fun idea! This one adds seconds and keeps running on the same line. Perl's probably cheating though. :)
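The command isn't reproduced here, but the general shape of such a same-line clock (a guess at the idea, not the original) is:
perl -e '$| = 1; while (1) { printf "\r%s", scalar localtime; sleep 1 }'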
The above one-liner could be run against all HTML files in a directory. It renames the HTML files based on the text contained in their title tag. This helped me in a situation where I had a directory containing thousands of HTML documents with meaningless filenames.
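The one-liner itself isn't quoted here; a rough sketch of the approach (assuming simple <title> tags, sanitising the title into a safe filename, and GNU mv -n to avoid clobbering when two pages share a title) could be:
for f in *.html; do
  t=$(perl -0777 -ne 'print $1 if m{<title>\s*(.*?)\s*</title>}is' "$f" | tr -c '[:alnum:]' '_' | sed 's/_*$//')
  [ -n "$t" ] && mv -n -- "$f" "$t.html"
done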
The Linux /dev/full file simulates a "disk full" condition, and can be used to verify how a program handles this situation. In particular, several programming language implementations do not print error diagnostics (nor exit with error status) when I/O errors like this occur, unless the programmer has taken additional steps. That is, simple code in these languages does not fail safely. In addition to Perl, C, C++, Tcl, and Lua (for some functions) also appear not to fail safely.
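For example (exact messages and exit statuses vary with shell and perl versions):
echo hello > /dev/full; echo "shell exit status: $?"
perl -e 'print "hello\n"' > /dev/full; echo "perl exit status: $?"
The shell typically complains about the failed write and returns non-zero, while the plain perl print may stay silent and exit 0, which is the unsafe behaviour described above.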
Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file. Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.
Based on the execute-with-timeout command on this site. A more complex script:
#!/bin/sh
# This script will check the availability of a list of NFS mount points,
# forcing a remount of those that do not respond in 5 seconds.
#
# It basically does this:
#   NFSPATH=/mountpoint TIMEOUT=5; perl -e "alarm $TIMEOUT; exec @ARGV" "test -d $NFSPATH" || (umount -fl $NFSPATH; mount $NFSPATH)
#
TIMEOUT=5
SCRIPT_NAME=$(basename $0)
for i in "$@"; do
  echo "Checking $i..."
  if ! perl -e "alarm $TIMEOUT; exec @ARGV" "test -d $i" > /dev/null 2>&1; then
    echo "$SCRIPT_NAME: $i is failing with retcode $?." 1>&2
    echo "$SCRIPT_NAME: Submitting umount -fl $i" 1>&2
    umount -fl $i
    echo "$SCRIPT_NAME: Submitting mount $i" 1>&2
    mount $i
  fi
done
Just want to post a Perl alternative. Does not count hidden files ('.' ones).
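The alternative itself isn't reproduced here; one compact way to count only regular, non-hidden files in the current directory with Perl (a sketch, not necessarily the posted command) is:
perl -le 'print scalar grep { -f } glob "*"'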
bash.org is a collection of funny quotes from IRC.
WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them...
Thanks to Chen for the idea and initial version!
This script downloads a page of random quotes, filters the HTML to retrieve just the one-liner quotes, and outputs the first one.
Just barely under the required 255 chars :)
Improvement:
You can replace the head -1 at the end by:
awk 'length($0)>0 {printf("%s\n%%\n", $0)}' > bash_quotes.txt
which will separate the quotes with a "%" and place them in the file.
and then:
strfile bash_quotes.txt
which will make the file ready for the fortune command
and then you can:
fortune bash_quotes.txt
which will give you a random quote from those in the downloaded file.
I download a file periodically and then use the fortune in .bashrc so I see a funny quote every time I open a terminal.
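For instance, with the quotes file kept wherever you like (the path here is just an example), the .bashrc part can be as simple as:
[ -f ~/quotes/bash_quotes.txt ] && fortune ~/quotes/bash_quotes.txt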
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: