
DELETE all those duplicate files but one based on md5 hash comparison in the current directory tree

find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
2009-06-07 03:14:06
Functions: find perl rm xargs
19 votes

This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories).

Good for cleaning up collections of mp3 files, or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive.

md5sum can be substituted with sha1sum without problems.
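
For example, to use sha1sum instead, the pipeline is otherwise unchanged:

find . -type f -print0|xargs -0 sha1sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --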

The actual filename is not taken into account; just the hash is used.

Whatever sort thinks is the first filename is kept.

It is assumed that no filename contains a NUL byte (0x00).
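
Step by step, the pipeline reads like this (the same command as above, just split out and commented; nothing is changed):

find . -type f -print0 |               # list every file under ., NUL-separated
  xargs -0 md5sum |                    # hash each file
  sort |                               # identical hashes end up on adjacent lines
  perl -ne 'chomp;
    $ph = $h;                          # remember the hash from the previous line
    ($h, $f) = split(/\s+/, $_, 2);    # split into current hash and filename
    print "$f" . "\x00" if ($h eq $ph)' |  # print a file (NUL-terminated) only if its hash repeats
  xargs -0 rm -v --                    # delete exactly those repeats; the first copy survives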

As per the good suggestion in the first comment, this one does a hard link instead:

find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
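
To double-check the result, standard findutils/coreutils can list the files that now share an inode (-links +1 matches files with more than one hard link; the inode column of ls -li shows which files point at the same data):

find . -xdev -type f -links +1 -exec ls -li {} +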

Alternatives

There are 2 alternatives; vote for the best!



What others think

Very cute.

Alternatively, if you want to keep the appearance but still save space, you can use the "hardlink" command. This searches recursively for duplicates and replaces them all with hard links.

Comment by flatcap 320 weeks and 5 days ago
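
For reference, hardlink exists in a couple of variants (the original Fedora tool and the util-linux rewrite) whose options differ slightly, so check your local man page; in both, something like this previews and then performs the linking:

hardlink -n -v .   # dry run: report what would be linked, change nothing
hardlink -v .      # replace duplicates with hard links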

fdupes is just simple! If you just want one of the files, type:

fdupes -fN
Comment by debuti 320 weeks and 4 days ago
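
For context, with current fdupes the usual recursive clean-up combines -r (recurse), -d (delete) and -N (no prompt; keep the first file of each duplicate set); options have shifted between versions, so consult your man page:

fdupes -rdN .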
