commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and of 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Searches for *.cpp and *.h in directory structure, counts the number of lines for each matching file and adds the counts together.
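The entry doesn't reproduce the command itself; a minimal sketch matching the description (GNU find assumed) could be:

  find . \( -name '*.cpp' -o -name '*.h' \) -exec wc -l {} +

wc prints a per-file count plus a final "total" line; note that with very many files, -exec runs wc in batches and you will get one "total" line per batch.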
On OS X you would have to "sudo -s" your way to happiness, since the command will otherwise produce a few "Permission denied" errors before finally spitting out the results. On OS X the directory structure has to start with the "Users" directory; the operation is then performed recursively from there.
The -h option of du and sort (on appropriate distributions) makes the output human-readable while still sorting by size in reverse order (sort -rh).
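A minimal sketch, assuming GNU coreutils (sort -h requires coreutils 7.5 or later):

  du -h --max-depth=1 | sort -rh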
The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). By default that file is readable and writable by root only. The list can be quite extensive, with lots of bots hammering away at your machine. Sometimes it is more important to see the scale of things, or in this case the volume of failed logins tied to each source IP.
The awk statement determines if the 3rd element is an IP address, and if so increments the running count of failed login attempts associated with it. When done it prints the IP and count.
The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, and that the -n and -k can be combined to -nk.
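The command itself isn't shown here; putting the description together, a hedged reconstruction might look like:

  sudo lastb | awk '$3 ~ /^[0-9]+(\.[0-9]+){3}$/ { count[$3]++ } END { for (ip in count) print ip, ":", count[ip] }' | sort -nk 3

The output format is "IP : count", so the IP is column 1, ':' is column 2, and the count being sorted on is column 3.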
Please be aware that the btmp file will contain every instance of a failed login unless explicitly rolled over. It should be safe to delete/archive this file after you've processed it.
Generates the list of clients (IP addresses) that have used the Squid web proxy according to the most recent log. Every IP appears only once in the list.
This probably only works without modifications in RHEL/CentOS/Fedora.
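A sketch, assuming Squid's native log format (client IP in the 3rd field) and the default RHEL log path:

  awk '{print $3}' /var/log/squid/access.log | sort -u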
This command will find the processes with the highest context-switch counts on a server and give you the process listing.
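The original command isn't shown; one way to approximate it from /proc on Linux (treat this as a sketch, not the original) is:

  for d in /proc/[0-9]*; do
    awk -v pid="${d#/proc/}" '
      /^Name:/                      { name = $2 }
      /^voluntary_ctxt_switches/    { v = $2 }
      /^nonvoluntary_ctxt_switches/ { nv = $2 }
      END { print v + nv, pid, name }' "$d/status" 2>/dev/null
  done | sort -rn | head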
This command makes it easy to determine free IP ranges in a crowded subnet.
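For example, with nmap (the subnet below is illustrative):

  nmap -v -sn 192.168.1.0/24 | grep 'host down'

-sn does host discovery without a port scan, and -v makes nmap report the down (i.e. presumably free) addresses as well.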
Admin rights are needed to run dpkg-query.
Get the links to external sites from a URL.
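A sketch using lynx; example.com and the domain filter are placeholders:

  lynx -dump -listonly http://example.com | grep -Eo 'https?://[^ ]+' | grep -v 'example\.com' | sort -u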
This sorts files in multiple directories by their modification date. Note that sorting is done at the end using "sort", instead of using the "-ltr" options to "ls". This ensures correct results when sorting a large number of files, in which case "find" will call "ls" multiple times.
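A sketch along those lines, assuming GNU ls for --time-style (date and time land in fields 6 and 7, which sort correctly as plain text in ISO format):

  find . -type f -exec ls -l --time-style=long-iso {} + | sort -k 6,7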
Get a list of all the unique hostnames from the Apache configuration files. Handy to see what sites are running on a server. When I saw the command I had some ideas to make it shorter. Here is my version.
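The shorter version isn't reproduced here; a hedged equivalent (the config path varies: /etc/apache2 on Debian-like systems, /etc/httpd on RHEL-like ones) might be:

  grep -RhiE '^[[:space:]]*Server(Name|Alias)' /etc/apache2/ | awk '{for (i = 2; i <= NF; i++) print $i}' | sort -u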
This command gives a human-readable result without messing up the sorting.
The following displays only the entries that are duplicates.
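For example (file.txt is a placeholder):

  sort file.txt | uniq -d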
The uniq command is mostly used in combination with the sort command, because uniq removes duplicates only from sorted input: for uniq to work, all the duplicate entries have to be on adjacent lines.
ls -al lists all files; sort +4n sorts them numerically by the 5th field (+4n is the obsolete key syntax).
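On modern sort the equivalent -k key syntax is:

  ls -al | sort -k 5,5n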
Sort by the kth column, using ':' as the delimiter.
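For example, sorting /etc/passwd numerically by its 3rd field (the UID):

  sort -t ':' -k 3,3n /etc/passwd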
I had the problem that our monitoring showed the "/" filesystem was more than 90% full. This command helped me find out quickly which subdirectories were the biggest. The system has many NFS mounts, hence the -x.
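A sketch of that kind of check, assuming GNU du and sort (-x keeps du on the root filesystem and out of the NFS mounts):

  du -xh --max-depth=1 / | sort -rh | head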