Commands by masterofdisaster (5)

  • When you merge pictures from different digital cameras and different people into one directory, you end up with clashing file names and numbers padded to three or four digits, and the result is a mess. But if you can add an offset to the picture number and choose the number of leading zeros in the new file names, it becomes manageable. OFFS != 0 combined with LZ equal to the padding the files already have is not supported. Or left as an exercise, hoho ;) I love NF="${NF/#+(0)/}", it looks like a magic bash spell; note that the +(0) pattern needs shopt -s extglob, and stripping the leading zeros keeps the arithmetic from treating the number as octal. A commented, multi-line version is sketched after this list.


    2
    OFFS=30;LZ=6;FF=$(printf %%0%dd $LZ);for F in *.jpg;do NF="${F%.jpg}";NF="${NF/#+(0)/}";NF=$[NF+OFFS];NF="$(printf $FF $NF)".jpg;if [ "$F" != "$NF" ];then mv -iv "$F" "$NF";fi;done
    masterofdisaster · 2010-11-08 22:48:56 2
  • Nice reading in the morning on the way to work, but sadly the .tar.gz for the whole issue 66 is not on phrack's website yet. So use wget to download.


    2
    mkdir phrack66; (cd phrack66; for n in {1..17} ; do echo "http://www.phrack.org/issues.html?issue=66&id=$n&mode=txt" ; done | xargs wget)
    masterofdisaster · 2009-06-11 21:42:42 1
  • What do you do when nmap is not available and you want to see which hosts respond to an ICMP echo request? This one-liner prints the IPv4 address of every host that answers. A spelled-out version appears after this list.


    3
    ( nw=192.168.0 ; h=1; while [ $h -lt 255 ] ; do ( ping -c2 -i 0.2 -W 0.5 -n $nw.$h & ); h=$[ $h + 1 ] ; done ) | awk '/^64 bytes.*/ { gsub( ":","" ); print $4 }' | sort -u
    masterofdisaster · 2009-06-07 15:14:46 0
  • Sort the ls output of all files in the current directory by size, in ascending order. Just the 20 biggest ones: ls -la | sort -k 5bn | tail -n 20. A variant for the current directory tree, including subdirectories and with pretty columns: find . -type f -print0 | xargs -0 ls -la | sort -k 5bn | column -t. And to find the subdirectories consuming the most space, with a displayed block size of 1k: du -sk ./* | sort -k 1bn | column -t.


    6
    ls -la | sort -k 5bn
    masterofdisaster · 2009-06-07 14:35:17 3
  • This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories). Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on the hard disk. md5sum can be substituted with sha1sum without problems. The actual file name is not taken into account, just the hash is used; whatever sort thinks is the first file name is kept. It is assumed that the file name does not contain 0x00. As per the good suggestion in the first comment, this variant creates a hard link instead of deleting: find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }' An unrolled, commented version of the delete variant is sketched after this list.


    19
    find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
    masterofdisaster · 2009-06-07 03:14:06 2
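
A commented, multi-line version of the renumbering one-liner, as a sketch; the offset (30), the padding (6) and the *.jpg pattern are just the example values from above:

    # Renumber NNN.jpg / NNNN.jpg files by adding an offset and re-padding the number.
    shopt -s extglob                        # needed for the +(0) pattern below
    OFFS=30                                 # offset added to each picture number
    LZ=6                                    # digits in the new file names
    FF=$(printf '%%0%dd' "$LZ")             # build the format string, e.g. %06d
    for F in *.jpg; do
        NF=${F%.jpg}                        # strip the extension
        NF=${NF/#+(0)/}                     # strip leading zeros so bash does not read them as octal
        NF=$(printf "$FF" $(( NF + OFFS ))).jpg
        [ "$F" != "$NF" ] && mv -iv "$F" "$NF"
    done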
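
The ping sweep, spelled out; 192.168.0 is an example prefix, and the awk filter relies on the "64 bytes from <address>: ..." reply format of Linux ping:

    nw=192.168.0                            # /24 network prefix to sweep (example)
    for h in $(seq 1 254); do
        # two echo requests per host, 0.2 s apart, 0.5 s reply timeout, numeric output
        ping -c2 -i 0.2 -W 0.5 -n "$nw.$h" &
    done | awk '/^64 bytes/ { gsub(":",""); print $4 }' | sort -u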
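
And the duplicate remover unrolled; it keeps whichever file name sorts first for each hash and deletes the rest (the hard-link variant from the description replaces the deletion with perl's unlink and link):

    # List every file with its md5, sort so identical hashes end up adjacent,
    # then print (NUL-terminated) each file whose hash equals the previous line's.
    find . -type f -print0 |
        xargs -0 md5sum |
        sort |
        perl -ne 'chomp;
                  $ph = $h;                          # hash seen on the previous line
                  ($h, $f) = split(/\s+/, $_, 2);    # current hash and file name
                  print "$f\x00" if $h eq $ph;       # duplicate -> emit for deletion
                 ' |
        xargs -0 rm -v --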

Check These Out

Determine space taken by files of certain type
Just how much space are those zillions of database logs taking up? How much will you gain at a compression rate of, say, 80%? This little line gives you a good start for your calculations.
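
A sketch of that kind of calculation, assuming GNU find and taking *.log as the example file type:

    # Total size of all *.log files below the current directory, in MiB,
    # plus a rough estimate of what 80% compression would leave behind.
    find . -type f -name '*.log' -printf '%s\n' |
        awk '{ t += $1 } END { printf "%.1f MiB total, ~%.1f MiB after 80%% compression\n", t/1048576, t*0.2/1048576 }'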

Download latest released gitlab docker container
Download latest released gitlab docker container

find all open files by named process
Lists all files that are opened by processes named $processname. The egrep 'w.+REG' filters out non-file listings in lsof, awk extracts the file names, and sort | uniq removes duplication.
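
A sketch in the same spirit, with nginx standing in for $processname; field 5 of lsof's default output is the TYPE column and REG marks regular files:

    # Regular files currently open by any process whose command name is "nginx".
    # Note: $NF only prints the last word, so paths containing spaces get truncated.
    lsof -c nginx | awk '$5 == "REG" { print $NF }' | sort -u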

Read null character separated fields from a file
Handles badly named files containing ", ', \n, \b, \t, ` and so on. Store the file names as a null-character-separated list with find . -print0 > name.lst and read them back with read -r -d "". E.g.: find . -print0 > name.lst; cat name.lst | while IFS="" read -r -d "" file; do ls -l "$file"; done

Sort processes by CPU Usage
A short list of the top 10 processes, sorted by CPU usage.
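
One way to get that list with procps ps (one header line plus ten processes):

    # Ten most CPU-hungry processes, highest first
    ps aux --sort=-%cpu | head -n 11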

easily strace all your apache processes
This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.
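
A sketch of that approach, assuming the processes are named apache2 (use httpd on Red Hat-style systems) and that you may attach to them:

    # Attach to every running apache2 process; -f also follows the children they
    # fork, so worker processes spawned later by the attached parent are traced too.
    strace -f $(pgrep apache2 | awk '{ print "-p", $1 }')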

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include: * $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one * the TTL for the credentials
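
The retrieval itself presumably looks something like this (the MFA_CODE variable and the 12-hour duration are illustrative); pipe the output into the awk above to print the export lines:

    # Ask for the current MFA code, then fetch temporary credentials via STS
    read -p "MFA code: " MFA_CODE
    aws sts get-session-token \
        --serial-number "$MFA_ID" \
        --token-code "$MFA_CODE" \
        --duration-seconds 43200 \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
        --output text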

print crontab entries for all the users that actually have a crontab
This is how I list the crontab for all the users on a given system that actually have a crontab. You could wrap it in a function block and place it in your .profile or .bashrc for quick access. There's probably a simpler way to do this. Discuss.
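
One possible shape for it, run as root so that crontab -l -u is permitted; users without a crontab make crontab -l exit non-zero and are skipped:

    # Print each user's crontab under a small header, skipping users without one
    for u in $(cut -d: -f1 /etc/passwd); do
        c=$(crontab -l -u "$u" 2>/dev/null) && printf '### %s ###\n%s\n' "$u" "$c"
    done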

rsync with progress bar.
Transfers files from localhost to a remote host.
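
For instance (the paths and host name are placeholders):

    # Archive mode, human-readable sizes, per-file progress indicator
    rsync -avh --progress /path/to/local/dir/ user@remotehost:/path/to/dest/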

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1"
        D=$(( SS / 86400 ))
        H=$(( SS % 86400 / 3600 ))
        M=$(( SS % 3600 / 60 ))
        S=$(( SS % 60 ))
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }

