Commands by masterofdisaster (5)

  • When you have pictures from different digital cameras and different people or friends and want to merge them all, you end up with duplicate filenames, files whose numbers have three or four digits, and so on; copying everything into one directory makes a mess. But if you can add an offset to the picture number and set the number of leading zeros in the filename's number, it becomes manageable. OFFS != 0 combined with LZ equal to the padding the files already have is not supported (the new names could collide with originals that have not been renamed yet), or left as an exercise, hoho ;) I love NF="${NF/#+(0)/}", it looks like a magic bash spell. A commented, multi-line version of the command follows this entry.


    2
    OFFS=30;LZ=6;FF=$(printf %%0%dd $LZ);for F in *.jpg;do NF="${F%.jpg}";NF="${NF/#+(0)/}";NF=$[NF+OFFS];NF="$(printf $FF $NF)".jpg;if [ "$F" != "$NF" ];then mv -iv "$F" "$NF";fi;done
    masterofdisaster · 2010-11-08 22:48:56 5
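
    The same logic, spelled out and commented, might look like the sketch below (assuming bash with extglob enabled, which the +(0) pattern needs; OFFS, LZ and the *.jpg glob are taken straight from the one-liner):

        #!/usr/bin/env bash
        # Renumber *.jpg files: add an offset to each picture number and
        # re-pad the result with a fixed number of leading zeros.
        shopt -s extglob                     # required for the +(0) pattern
        OFFS=30                              # offset added to every picture number
        LZ=6                                 # digits in the new, zero-padded number
        FF=$(printf '%%0%dd' "$LZ")          # format string, e.g. "%06d"
        for F in *.jpg; do
            NF="${F%.jpg}"                   # drop the extension
            NF="${NF/#+(0)/}"                # strip leading zeros (avoids octal parsing)
            NF=$(( NF + OFFS ))              # apply the offset
            NF="$(printf "$FF" "$NF")".jpg   # re-pad and re-add the extension
            if [ "$F" != "$NF" ]; then
                mv -iv "$F" "$NF"
            fi
        done
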
  • Nice reading in the morning on the way to work, but sadly the .tar.gz for the whole of issue 66 is not on Phrack's website yet, so use wget to download the individual articles. A variant that saves the articles under friendlier names is sketched after this entry.


    2
    mkdir phrack66; (cd phrack66; for n in {1..17} ; do echo "http://www.phrack.org/issues.html?issue=66&id=$n&mode=txt" ; done | xargs wget)
    masterofdisaster · 2009-06-11 21:42:42 5
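
    The files land with the raw query string as their names (issues.html?issue=66&id=1&mode=txt and so on). Below is a sketch of a variant that gives each article a friendlier name; the phrack66 directory and the article-NN.txt naming are my own choices, while the URL pattern is the one used above:

        mkdir -p phrack66
        for n in {1..17}; do
            # -O sets the output filename instead of deriving it from the URL
            wget -O "phrack66/article-$(printf '%02d' "$n").txt" \
                "http://www.phrack.org/issues.html?issue=66&id=$n&mode=txt"
        done
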
  • What do you do when nmap is not available and you want to see which hosts respond to an ICMP echo request? This one-liner prints the IPv4 address of every host that answers. A commented, multi-line version follows this entry.


    3
    ( nw=192.168.0 ; h=1; while [ $h -lt 255 ] ; do ( ping -c2 -i 0.2 -W 0.5 -n $nw.$h & ); h=$[ $h + 1 ] ; done ) | awk '/^64 bytes.*/ { gsub( ":","" ); print $4 }' | sort -u
    masterofdisaster · 2009-06-07 15:14:46 5
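
    Spread over several lines with comments, the sweep reads like the sketch below (the -i and -W options assume the Linux iputils ping; other ping implementations may spell these flags differently):

        nw=192.168.0                          # network prefix to sweep
        for h in $(seq 1 254); do
            # two echo requests per host, 0.2 s apart, 0.5 s reply timeout,
            # numeric output; backgrounded so the whole sweep runs in parallel
            ping -c2 -i 0.2 -W 0.5 -n "$nw.$h" &
        done | awk '/^64 bytes/ { gsub(":",""); print $4 }' | sort -u
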
  • Sort the ls output for all files in the current directory by size, in ascending order. Just the 20 biggest ones: ls -la | sort -k 5bn | tail -n 20. A variant for the current directory tree, including subdirectories, with pretty columns: find . -type f -print0 | xargs -0 ls -la | sort -k 5bn | column -t. And to find the subdirectories consuming the most space, with a displayed block size of 1 KB: du -sk ./* | sort -k 1bn | column -t. The annotated variants are shown after this entry.


    6
    ls -la | sort -k 5bn
    masterofdisaster · 2009-06-07 14:35:17 8
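
    The sort key does the heavy lifting: field 5 of ls -la is the size, b skips leading blanks in the key, and n compares numerically. The three variants from the description, annotated (GNU coreutils assumed; the column layout of ls can differ on other systems):

        # 20 largest files in the current directory
        ls -la | sort -k 5bn | tail -n 20

        # same idea for the whole tree, lined up in columns
        find . -type f -print0 | xargs -0 ls -la | sort -k 5bn | column -t

        # subdirectories using the most space, sizes in 1 KB blocks
        du -sk ./* | sort -k 1bn | column -t
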
  • This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories). Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in gazillion incarnations on your hard drive. md5sum can be substituted with sha1sum without problems. The actual filename is not taken into account; only the hash is used. Whatever sort considers the first filename is kept. It is assumed that the filenames do not contain 0x00. As per the good suggestion in the first comment, this variant creates a hard link instead of deleting: find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }' An unrolled, commented version of the delete pipeline follows this entry.


    19
    find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
    masterofdisaster · 2009-06-07 03:14:06 15
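
    Unrolled, the delete variant is easier to follow (same pipeline as above: after sorting, identical hashes sit next to each other, and every entry whose hash equals the previous one is a duplicate and gets removed):

        find . -type f -print0 |
            xargs -0 md5sum |
            sort |
            perl -ne '
                chomp;
                $ph = $h;                        # remember the previous hash
                ($h, $f) = split(/\s+/, $_, 2);  # current hash and filename
                print "$f\x00" if $h eq $ph;     # duplicate: emit NUL-terminated name
            ' |
            xargs -0 rm -v --
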
