To allow recursion:
find -type f -exec md5sum '{}' ';' | sort | uniq -c -w 33 | sort -gr | head -n 5 | cut -c1-7,41-
To display only the filenames:
find -maxdepth 1 -type f -exec md5sum '{}' ';' | sort | uniq -c -w 33 | sort -gr | head -n 5 | cut -c43-
Useful in scripts to compare known malicious IP addresses with what you are actually blocking.
Check which IPs have the highest number of connections.
It works extremely fast, because it calculates the md5sum only for files that have the same size and name. But there is nothing for free: it won't find duplicates with different names.
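As a hedged illustration of the duplicate-finding idea (my own minimal variant, not the command above): hash the files, then keep only lines whose first 32 characters, the MD5 digest, occur more than once.

```shell
# Minimal sketch: find files with identical contents via their MD5 digest.
# The temp files here are just demo fixtures.
dir=$(mktemp -d)
printf 'same contents'   > "$dir/a.txt"
printf 'same contents'   > "$dir/b.txt"
printf 'unique contents' > "$dir/c.txt"

# md5sum prints a 32-char hash first, so uniq -w32 -D keeps every line
# whose hash occurs more than once.
find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 -D
# a.txt and b.txt are printed; c.txt is not
```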
If you get a low-free-inodes alert in your monitoring app, you can use this command to find any directory with too many files and subdirectories.
If you have two sets of files that may share hard-linked files, it can be useful to identify which ones are hard links to the same underlying inode (file). This command shows all of those, sorted by inode number. In my example the two directory trees to compare share a common parent, so I run the command in that parent and just use
find .
to start from the current directory. If yours are in different locations, you can pass both paths to find:
find /directory1 /directory2 -type f -printf '%10i %p\n' | sort | uniq -w 11 -d -D | less
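A reproducible sketch of the same idea, using demo fixtures of my own (note that `uniq -D` alone is enough to print every repeated line):

```shell
# Two hard links to one inode, then the inode-grouping pipeline from above.
dir=$(mktemp -d)
echo data > "$dir/one"
ln "$dir/one" "$dir/two"          # hard link: same inode, second name

# %10i pads the inode to 10 chars, so -w 11 compares inode plus the space.
find "$dir" -type f -printf '%10i %p\n' | sort | uniq -w 11 -D
# prints both paths, since they share an inode
```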
This command will generate "CHECK TABLE `db_name.table_name` ;" statements for all tables present in databases on a MySQL server, which can be piped into the mysql command. (Can also be altered to perform OPTIMIZE and REPAIR functions.) Tested on MySQL 4.x and 5.x systems in a Linux environment under bash.
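A sketch of the statement-generation step, fed with a hard-coded schema/table listing; the table names, credentials and the surrounding mysql invocation are assumptions, not the original command.

```shell
# Turn "schema<TAB>table" pairs into CHECK TABLE statements.
printf 'shop\tusers\nshop\torders\n' |
  awk -F'\t' '{ printf "CHECK TABLE `%s`.`%s`;\n", $1, $2 }'

# In practice the listing would come from information_schema, e.g.:
#   mysql -N -e "SELECT table_schema, table_name FROM information_schema.tables
#                WHERE table_schema NOT IN ('mysql','information_schema')"
# and the generated statements would be piped back into mysql.
```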
Can be used to discover which programs create internet traffic. Skip the part after awk to get more details. Does anyone have an idea why uniq doesn't work properly here (see sample output)?
You'll run into trouble if you have files with missing newlines at the end. I tried to use
PAGER='sed \$q' git blame
and even
PAGER='sed \$q' git -p blame
to force a newline at the end, but as soon as the output is redirected, git seems to ignore the pager.
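A hedged workaround sketch: rather than fighting the pager, check for the missing final newline directly. `tail -c 1` prints the file's last byte, and command substitution strips a trailing newline, so the result is empty exactly when the file ends with one.

```shell
# Demo fixture: a file with no trailing newline.
f=$(mktemp)
printf 'last line has no newline' > "$f"

if [ -n "$(tail -c 1 "$f")" ]; then
  echo "missing trailing newline"
fi
# note: an empty file is treated as fine, since tail prints nothing
```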
Print the members present in both file1 and file2.
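One common way to do this (an assumption about the intended command, since the snippet itself isn't shown): `comm -12` prints only lines common to both of its sorted inputs.

```shell
f1=$(mktemp); f2=$(mktemp)
printf 'alice\nbob\ncarol\n' > "$f1"   # members of group 1 (sorted)
printf 'bob\ncarol\ndave\n'  > "$f2"   # members of group 2 (sorted)

# -1 suppresses lines only in f1, -2 lines only in f2; what remains
# is the intersection.
comm -12 "$f1" "$f2"
# prints: bob, carol
```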
First column is the number of photos, second column is the focal length.
Per-country GET report, based on the access log. Easy to adapt to unique IPs.
#_connects src_IP dst_IP When_It_Happened_Secs
The output also contains some garbage (text fragments from netstat's output), but it's good enough for a quick check of who's overloading your server.
This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first two columns, passes them through uniq to remove any duplicates, and then hands them to linkchecker, which checks the links. Broken links go into the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious; see their site for info. Updated to be non-recursive.
Btrfs reports the inode numbers of files with failed checksums. Use `find` to lookup the file names of those inodes.
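A hedged sketch of the lookup; here the inode number comes from `stat` on a freshly created file rather than from a btrfs checksum error, and the search root is a temp directory instead of the real mount point.

```shell
dir=$(mktemp -d)
touch "$dir/damaged-file"

inode=$(stat -c %i "$dir/damaged-file")   # GNU stat; in real use, take the
                                          # inode from the btrfs error message
find "$dir" -inum "$inode"                # resolves the inode back to a path
```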
Displays IPs that unsuccessfully attempted to log in 5 or more times today. You may want to filter out any trusted IPs and localhost. Useful for obtaining a list of IP addresses to block at the firewall.
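A sketch of the counting logic, fed with fabricated sshd log lines; the real command would read /var/log/auth.log or /var/log/secure, and the exact log format here is an assumption.

```shell
{
  for i in 1 2 3 4 5; do
    printf 'sshd[1]: Failed password for root from 10.0.0.1 port 22 ssh2\n'
  done
  printf 'sshd[1]: Failed password for root from 10.0.0.2 port 22 ssh2\n'
} |
  grep 'Failed password' |
  awk '{ for (i = 1; i <= NF; i++) if ($i == "from") print $(i+1) }' |
  sort | uniq -c | awk '$1 >= 5 { print $2 }'
# prints only 10.0.0.1 (5 failures); 10.0.0.2 had just one
```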
Useful for checking DDoS attacks on servers.
Searches /var/log/secure for SMTP connections, then lists them by number of connections and host.
I make extensive use of sudo, so I had to exclude the sudo part of the command history.
This grabs all lines that make an instantiation or static call, then filters out the cruft and displays a summary of each class called and its frequency.
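A hedged variant covering the instantiation half (the original command isn't shown, and counting static `Class::method` calls would need a second pattern):

```shell
# Demo fixture: a tiny PHP tree.
src=$(mktemp -d)
cat > "$src/a.php" <<'EOF'
<?php
$a = new Logger();
$b = new Logger();
$c = new Mailer();
EOF

# Pull out every "new ClassName", then tally per class.
grep -rhoE 'new +[A-Za-z_][A-Za-z0-9_]*' "$src" |
  awk '{ print $2 }' | sort | uniq -c | sort -nr
# Logger is counted twice, Mailer once
```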