commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and of 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This one uses dictionary.com
This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include to a random line with vim. Drop this in your .bash_aliases and make sure that file is initialized in your .bashrc.
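The alias might look something like this (a sketch; the exact file and line selection in the original command may differ):

```shell
# Hypothetical reconstruction of the 'busy' alias: pick a random header
# from /usr/include and open it in vim at a random line.
alias busy='f=$(find /usr/include -name "*.h" 2>/dev/null | sort -R | head -n1); vim +$((RANDOM % 100 + 1)) "$f"'
```

Putting the definition in .bash_aliases (and sourcing that file from .bashrc) makes it available in every new shell.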
Change the name of the process and what is echoed to suit your needs. The brackets around the h in the grep pattern cause grep to skip over its own "grep httpd" process; it is the equivalent of adding grep -v grep, but more elegant.
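The self-exclusion can be demonstrated on canned ps-style text (using httpd as the example process):

```shell
# The pattern [h]ttpd matches the string "httpd", but the grep process's
# own command line contains the literal text "[h]ttpd", which the pattern
# does NOT match -- so grep filters itself out of the ps output.
printf '/usr/sbin/httpd -k start\ngrep [h]ttpd\n' | grep '[h]ttpd'
# matches only the httpd line, not the "grep [h]ttpd" line
```

In real use the input would be `ps aux` rather than printf, avoiding the extra `| grep -v grep` stage.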
Works recursively in the specified directory, or '.' if none is given.
Repeatedly calls 'find' to look for a newer file; when no newer files exist, you have the newest.
In this case 'newest' means most recently modified. To find the most recently created change -newer to -cnewer.
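With GNU findutils the same answer can be had in a single pass (a sketch, assuming GNU find and coreutils):

```shell
# Print the most recently modified file under the current directory.
# %T@ is the mtime as a Unix timestamp, so a numeric sort puts the
# newest file last; cut strips the timestamp back off.
find . -type f -printf '%T@ %p\n' | sort -n | tail -n1 | cut -d' ' -f2-
```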
If a directory name contains a space, xargs will do the wrong thing. GNU Parallel (https://savannah.nongnu.org/projects/parallel/) deals with that better.
** Replace the ... in URLs with:
Couldn't fit in 256
Created on Ubuntu 9.10, but nothing out of the ordinary - it should work anywhere with a little tweaking. 5163 is the number of unique first names you get when you combine the male and female first-name files from http://www.census.gov/genealogy/www/data/1990surnames/names_files.html
Get the line containing "inet addr:" and the line before that, get down to only the first line, and then get the first word on that line, which should be the interface.
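Piped together, those steps might look like the following (demonstrated here on canned output; the "inet addr:" format assumes the older net-tools ifconfig):

```shell
# grep -B1 keeps the "inet addr:" line plus the line before it (the
# interface line); head keeps only that first line; awk prints its
# first word, the interface name.
printf 'eth0      Link encap:Ethernet\n          inet addr:192.168.1.2\n' \
  | grep -B1 'inet addr:' | head -n1 | awk '{print $1}'
# -> eth0
```

In real use the input would come from `ifconfig` rather than printf.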
A slightly shorter version. It also doesn't put a return character at the end of the password.
Uses the dumb terminal option in gnuplot to plot a graph of frequencies. In this case, we are looking at a frequency analysis of words in all of the .c files.
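The counting half of such a pipeline can be sketched as follows (the exact gnuplot invocation in the original is an assumption):

```shell
# Top 10 word frequencies across all .c files in the current directory:
cat *.c 2>/dev/null | tr -cs 'A-Za-z' '\n' | sort | uniq -c | sort -rn | head -10
# Feeding the count column to gnuplot's dumb (ASCII) terminal gives a
# rough textual chart, e.g.:
#   ... | awk '{print $1}' | gnuplot -e "set terminal dumb; plot '-' with impulses"
```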
This uses urandom to produce a random password. The random values are uuencoded to ensure only printable characters. This only works for a number of characters between 1 and 60.
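A sketch of the idea; the uuencode form matches the description (and needs the sharutils package), while the base64 variant is a more commonly available equivalent I've substituted for the runnable line:

```shell
n=12   # desired password length (1..60)
# As described: uuencode -m emits only printable characters.
#   head -c "$n" /dev/urandom | uuencode -m - | head -n2 | tail -n1 | cut -c "1-$n"
# Same idea with coreutils base64, which -m encoding amounts to:
head -c "$n" /dev/urandom | base64 | head -n1 | cut -c "1-$n"
```

The 60-character ceiling comes from the encoder wrapping its output into lines of limited length; cutting from the first line only works up to that limit.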
Similar but using mediainfo instead of totem-something
Note the xargs at the end.
Some malicious programs append an iframe or script tag to the web pages on a server; use this command to clean them in batch.
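A cleanup along these lines might look like the following (the "evil" pattern is a placeholder - match it to the actual injected markup, and keep backups):

```shell
# Strip an injected <iframe ...evil...></iframe> from every .html file,
# leaving a .bak copy of each original (GNU sed's in-place edit):
find . -name '*.html' -exec sed -i.bak 's|<iframe[^>]*evil[^>]*>.*</iframe>||g' {} +
```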
Makes use of $RANDOM environment variable.
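For reference, how $RANDOM behaves:

```shell
# $RANDOM expands to a new pseudo-random integer in 0..32767 each time
# it is referenced (a bash/ksh/zsh feature, not plain POSIX sh):
echo $((RANDOM % 6 + 1))   # e.g. simulate a die roll: 1..6
```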
'jot' does not come with most *nix distros, so we need to use seq to make it work. This version tested well on Fedora 11.
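The substitution is straightforward:

```shell
# BSD:  jot 5      -> prints 1 through 5
# GNU equivalent, one number per line:
seq 1 5
```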
This is an extension of a previous command by satyavvd on 2009-07-23 12:04:02, but this one grabs the whole archive. Hard-coded numbers in the previous script capped the number of commands that could be fetched; this one grabs them all, regardless of how big the archive gets.
List files and sizes
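One common way to do this (a sketch; the original command may differ):

```shell
# List regular files under the current directory with human-readable
# sizes, smallest to largest (GNU du and sort -h):
find . -type f -exec du -h {} + | sort -h
```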