Commands using readlink (26)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

list files recursively by size

processes per user counter
Enumerates the number of processes for each user. ps BSD format is used here; for standard Unix format use: ps -eLf | awk '{++P[$1]} END {for (a in P) if (a != "UID") print a, P[a]}'
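The BSD-format variant referred to above isn't reproduced on this page; a minimal sketch of what it could look like, assuming standard ps aux output where column 1 is the user:

$ ps aux | awk 'NR > 1 {++P[$1]} END {for (a in P) print a, P[a]}'   # NR > 1 skips the header line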

Count number of hits per IP address in last 2000 lines of apache logs and print the IP and hits if hits > 20
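The command itself isn't quoted in this listing; one way to express that logic, assuming a combined-format log at /var/log/apache2/access.log (adjust the path for your setup), is:

$ tail -n 2000 /var/log/apache2/access.log | awk '{++hits[$1]} END {for (ip in hits) if (hits[ip] > 20) print ip, hits[ip]}' | sort -rn -k2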

Drop or block attackers IP with null routes
Someone might attack your system. You can drop an attacker's IP using iptables, but you can also use the route command to null-route the unwanted traffic. A null route (also called a blackhole route) is a network route or kernel routing table entry that goes nowhere. Matching packets are dropped (ignored) rather than forwarded, acting as a kind of very limited firewall. The act of using null routes is often called blackhole filtering.
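As an illustration (192.0.2.10 is a placeholder address, not from the original entry), a null route can be added and removed like this on Linux:

$ sudo ip route add blackhole 192.0.2.10/32   # silently drop all traffic routed to this host
$ sudo ip route del blackhole 192.0.2.10/32   # lift the block again
$ sudo route add -host 192.0.2.10 reject      # net-tools equivalent (rejects with ICMP unreachable rather than silently dropping)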

Download and install the newest dropbox beta

search for files or directories, then show a sorted list of just the unique directories where the matches occur
Ever use 'locate' to find a common phrase in a filename or directory name? Often you'll get a huge list of matches, many of which are redundant, and typically the results are not sorted. This command will 'locate' your search phrase, then show you a sorted list of just the relevant directories, with no duplicates. For example, maybe you have installed several versions of the Java JRE and you want to track down every directory where files matching "java" might exist. A plain 'locate java' is likely to return a huge list with many repeated directories, since many files in one directory could contain the phrase "java". This command whittles the results down to a minimal list of unique directory names where your search phrase finds a match.
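The exact command isn't reproduced here; a minimal sketch of the idea ("java" is just the example search phrase from the description) could be:

$ locate -0 java | xargs -0 -r -n1 dirname | sort -u   # -0/--null keeps paths with whitespace intact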

Count the total number of files in each immediate subdirectory
Counts the total (recursive) number of files in the immediate (depth 1) subdirectories, as well as the current directory, and displays the counts sorted. Fixed, as per ashawley's comment.
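The command isn't quoted in this listing; a rough sketch of the idea with find and a shell loop (counting recursively under each depth-1 directory, including '.') could look like:

$ find . -maxdepth 1 -type d | while read -r dir; do printf '%s\t%s\n' "$(find "$dir" -type f | wc -l)" "$dir"; done | sort -n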

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$H" -gt 0 ] && printf "%02g:" "$H"
  printf "%02g:%02g\n" "$M" "$S"
}
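A quick usage example (outputs follow directly from the function above):

$ sec2dhms 3661
01:01:01
$ sec2dhms 90061
1:01:01:01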

get a mysqldump with a timestamp in the filename and gzip it all in one go
Performs a mysqldump and gzip-compresses the output, putting a timestamp in the resulting dump filename. Inspect the file for integrity or fun with this command afterward, if you desire: $ zcat mysqldump-2009-06-12-07.41.01.tgz | less
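The command itself isn't shown above; a sketch matching the timestamp format of the example filename (user and database name are placeholders) would be something like:

$ mysqldump -u someuser -p somedatabase | gzip > "mysqldump-$(date +%Y-%m-%d-%H.%M.%S).tgz"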


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: