commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This truncates any lines longer than 80 characters. Also useful for looking at different parts of the line, e.g. cut -b 50-100 shows columns 50 through 100.
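The original command isn't shown here, but the description fits a cut invocation along these lines (a minimal sketch; the 100-character sample line is purely for illustration):

```shell
# a 100-character sample line, just for demonstration
long_line=$(printf '%0100d' 0)
# truncate each line to its first 80 characters
echo "$long_line" | cut -c 1-80
# inspect a different slice, e.g. byte columns 50 through 100
echo "$long_line" | cut -b 50-100
```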
Faster than lsof by at least 2x on my box.
This command greps the entire directory tree for references to each of the files in the list. This is useful for cleaning old static files that are no longer in use out of your project. It also ignores .svn directories so the counts stay accurate. Replace 'static/images/' with the directory containing the files you want to search for.
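The exact command isn't reproduced here; one way to sketch the idea, with `unused_count` a hypothetical helper name and GNU grep assumed for --exclude-dir:

```shell
# For each file in the static dir, count how many files in the project
# mention its basename; a count of 0 marks a candidate for removal.
unused_count() {  # hypothetical helper: $1 = static dir, $2 = project root
  for f in "$1"/*; do
    hits=$(grep -rl --exclude-dir=.svn "$(basename "$f")" "$2" | wc -l)
    echo "$hits $f"
  done | sort -n
}
# Usage: unused_count static/images .
```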
This example summarizes the size of all pdf files in the /tmp directory and its subdirectories (in bytes).
Replace "/tmp" with a directory path of your choice and "\*pdf" (or the whole "-iname \*pdf") with your own pattern to match a specific type of file. You can also change the du parameter to count kilobytes or megabytes, but because du rounds the sizes, the sum will not be correct (especially with lots of small files, and when counting megabytes).
In some cases you could probably use something like this:
du -cb `find /tmp -type f -iname \*pdf` | tail -n 1
But be aware that this second command CANNOT handle files with spaces in their names, and it will mislead you if some files matching the pattern are not readable by you. The first one-liner is resistant to such problems (it will not count the sizes of files you can't read, but it will give you the correct sum of the rest).
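If GNU coreutils are available, a variant that survives spaces in file names is to pass NUL-delimited names via du's --files0-from option (sketch only; `pdf_total` is a hypothetical helper name):

```shell
# Sum the byte sizes (-b, apparent size) of all matching files under $1,
# spaces in names included, via NUL-delimited output from find.
pdf_total() {  # hypothetical helper: $1 = directory to scan
  find "$1" -type f -iname '*.pdf' -print0 | du -cb --files0-from=- | tail -n 1
}
# Usage: pdf_total /tmp
```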
Queries the local memcached for stats, calculates the hit/get ratio and prints it out.
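The original one-liner isn't shown; a sketch of the ratio calculation, assuming memcached on 127.0.0.1:11211 and nc available (`hit_ratio` is a hypothetical helper name; memcached's stats output uses CRLF line endings, hence the tr):

```shell
# Parse memcached "stats" output from stdin and print get_hits/cmd_get as a %.
hit_ratio() {
  tr -d '\r' | awk '$2 == "get_hits" {h=$3}
                    $2 == "cmd_get"  {g=$3}
                    END {if (g) printf "%.1f%%\n", 100*h/g}'
}
# Usage (requires a running memcached):
# printf 'stats\nquit\n' | nc 127.0.0.1 11211 | hit_ratio
```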
If you have a text file with an unknown encoding, you can use this list to find out which one it is.
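Assuming "this list" refers to the encodings iconv knows about (the command itself isn't shown), one way to sketch the idea is to try each encoding in turn and report which ones convert the file cleanly (`detect_encodings` is a hypothetical helper name):

```shell
# Print every iconv encoding that converts $1 to UTF-8 without errors;
# the real encoding should be among the survivors.
detect_encodings() {
  for enc in $(iconv -l | tr -d '/,'); do
    iconv -f "$enc" -t UTF-8 "$1" >/dev/null 2>&1 && echo "$enc"
  done
}
# Usage: detect_encodings mystery.txt
```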
I know there are a lot of random password generators out there, but I wanted something that put out something besides hex. Set count equal to the number of bytes you want.
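The command itself isn't reproduced here; a minimal sketch of the approach described (raw random bytes made printable with base64 rather than hex):

```shell
# count = the number of random bytes to read; base64 keeps the result printable
count=16
head -c "$count" /dev/urandom | base64
```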
Easily list all users
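The original command isn't shown; the classic ways to do this are (getent shown as an alternative that also covers NIS/LDAP accounts when configured):

```shell
# usernames are the first colon-separated field of /etc/passwd
cut -d: -f1 /etc/passwd
# getent consults all configured name services, not just the local file
getent passwd | cut -d: -f1
```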
Only works on FreeBSD, where ports are installed in /usr/ports.
Credit to http://wiki.freebsd.org/PortsTasks
This command does a tally of concurrent active connections from single IPs and prints out those IPs that have the most active concurrent connections. VERY useful in determining the source of a DoS or DDoS attack.
This will tell you who has the most Apache connections by IP (replace IPHERE with the actual IP you wish to check). Or if you wish, remove | grep -c IPHERE for the full list.
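The command isn't reproduced here; a sketch of the tallying stage, with `top_ips` a hypothetical helper and Linux netstat output assumed for the usage line:

```shell
# Tally "addr:port" lines from stdin and print the busiest IPs first.
top_ips() {
  cut -d: -f1 | sort | uniq -c | sort -rn | head -20
}
# Usage (Linux): netstat -ntu | awk 'NR>2 {print $5}' | top_ips
```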
Lists the top 20 IPs from which TCP connections are in the SYN_RECV state.
Useful on web servers to detect a SYN flood attack.
Replace SYN_ with ESTA to find established connections instead.
Useful in case of abuse or DoS attacks.
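The exact pipeline isn't shown above; a sketch matching the description, with `syn_top` a hypothetical helper name and Linux netstat -tan output assumed on stdin:

```shell
# Count half-open (SYN_RECV) connections per foreign IP, busiest first.
syn_top() {
  grep SYN_RECV | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head -20
}
# Usage: netstat -tan | syn_top
# Swap SYN_RECV for ESTABLISHED to see established connections instead.
```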
Checks files in the current directory and subdirectories, finds the files containing "sampleString", and removes the matching lines from those files.
* Beware that the command updates the original files [no backup].
The command can be extended by combining it with 'find',
e.g. to run only on certain types of files: *.xml, *.txt... (find -name "*.xml" | grep ...)
If anybody knows a better solution, please drop a comment. Thanks.
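The command itself isn't reproduced above; a sketch of the described behaviour, assuming GNU grep and GNU sed (`purge_lines` is a hypothetical helper name; as warned, sed -i edits in place with no backup):

```shell
# Delete every line containing "sampleString" from all files under $1.
purge_lines() {  # hypothetical helper: $1 = directory tree to clean
  # xargs -r (GNU) avoids running sed at all when no files match
  grep -rl 'sampleString' "$1" | xargs -r sed -i '/sampleString/d'
}
# Usage: purge_lines .
```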
Create a tgz archive of all the files containing local changes relative to a subversion repository.
Add the '-q' option to only include files under version control:
svn st -q | cut -c 8- | sed 's/^/\"/;s/$/\"/' | xargs tar -czvf ../backup.tgz
Useful if you are not able to commit yet but want to create a quick backup of your work. Of course, if you find yourself needing this, it's probably a sign that you should be using a branch, patches or distributed version control (git, mercurial, etc.).
Use the aliased command 'nsl'
Greps IRC logs for phrases and lists users who said them.