commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
Search all HTML files and remove the lines in which 'String' is found.
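A hedged sketch of one way to do this, combining find with GNU sed's in-place editing (the sample file here is fabricated so the example is self-contained):

```shell
# Fabricate a sample HTML file for the demonstration.
mkdir -p site
printf '<p>keep</p>\n<p>String to purge</p>\n' > site/index.html
# Delete every line containing the literal text "String" from all
# .html files under the current directory (GNU sed, in-place edit).
find . -name '*.html' -exec sed -i '/String/d' {} +
```

Note that `sed -i` without a backup suffix is GNU-specific; BSD sed wants `sed -i ''`.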
Find and kill multiple instances of a process with one simple command.
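One common approach is pkill from procps; in this self-contained sketch, the background sleep jobs stand in for the target process:

```shell
# Start two dummy instances of a long-running process...
d=300; sleep "$d" & sleep "$d" &
# ...then kill every instance in one go. -f matches against the full
# command line, so the pattern "sleep 300" catches both jobs at once.
pkill -f "sleep $d"
```

Alternatives include `killall name` or `kill $(pgrep name)`.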
Loads your CPU; run one instance for each CPU core.
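A minimal sketch using a busy pipe (yes and nproc are GNU coreutils; any tight loop works just as well):

```shell
# Peg one CPU core by piping an endless stream into /dev/null.
yes > /dev/null &
# To load every core, start one instance per core:
#   for i in $(seq "$(nproc)"); do yes > /dev/null & done
# Clean up afterwards with: pkill -x yes
```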
Use man pages; they give you "ultimate commands".
"ls -SshF --color" list by filesize (biggest at the top)
"ls -SshFr --color" list by filesize in reverse order (biggest at the bottom)
If curl isn't available, use lynx.
Use xdg-open without being shown error messages like this one:
(nautilus:3955): Gtk-WARNING **: Theme parsing error: Notebook.css:21:15: Junk at end of value
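A sketch of the idea, with a placeholder file name: redirect both output streams to /dev/null so warnings like the one above never reach the terminal.

```shell
# Open a file with the default application, discarding stdout and
# stderr; backgrounding with & frees the shell immediately.
xdg-open file.pdf > /dev/null 2>&1 &
```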
Replace D: with the drive letter of your mounted ISO virtual drive, and replace E: with your USB drive's letter.
To keep rm from being recursive until you have finished typing the command, put the -rf at the end!
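A self-contained illustration; note this relies on GNU rm accepting options after operands, which POSIX-strict implementations may not:

```shell
# Set up a throwaway directory tree for the demonstration.
mkdir -p scratch/sub && touch scratch/sub/file
# Flags last: if you hit Enter too early, "rm scratch" just errors out
# on the directory instead of deleting it recursively.
rm scratch -rf
```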
Useful for situations where you have word lists or dictionaries ranging from hundreds of megabytes to several gigabytes in size. Replace file.lst with your word list, and 50000 with however many lines you want the resulting list to contain in total. The result is redirected to output.txt in the current working directory. It may help to run wc -l file.lst first to find out how many lines the word list has, then divide that in half to work out the value for the head -n part of the command.
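The workflow might look like this sketch, with a fabricated 100000-line list standing in for file.lst:

```shell
# Fabricate a word list so the example is self-contained.
seq 100000 > file.lst
# wc -l file.lst reports 100000 lines; half of that is 50000.
# Keep the first 50000 lines, redirected to output.txt.
head -n 50000 file.lst > output.txt
```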
Good for finding outdated timthumb.php scripts that need to be updated: anything at version 2.0 or above should be secure; below that, timthumb is vulnerable and can be used to compromise your website.
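One possible way to check, assuming the usual define ('VERSION', …) line inside timthumb.php; the path and version string below are fabricated for illustration:

```shell
# Fabricate a vulnerable-looking install for the demonstration.
mkdir -p wp-content/themes/demo
printf "<?php\ndefine ('VERSION', '1.34');\n" \
    > wp-content/themes/demo/timthumb.php
# Report the version line from every timthumb.php under this tree.
find . -name 'timthumb.php' -exec grep -H "VERSION" {} +
```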
"play" is part of "SoX"
SoX - Sound eXchange, the Swiss Army knife of audio manipulation.
For details, see: man sox
biggest->smallest directories, then biggest->smallest files
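One hedged way to produce that ordering with du, sort, and ls; the sample directories and files here are fabricated:

```shell
# Sample layout: one big and one small directory, plus two files.
mkdir -p bigdir smalldir
dd if=/dev/zero of=bigdir/blob bs=1024 count=64 2>/dev/null
touch smalldir/empty
dd if=/dev/zero of=large.bin bs=1024 count=32 2>/dev/null
touch small.txt
# Directories, biggest first (sizes in KB)...
du -sk -- */ | sort -rn
# ...then files, biggest first (-p tags dirs with "/" so grep drops them).
ls -S -p | grep -v '/$'
```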
It's mostly useful for custom scripts that run on a specific host, when you're tired of SSH'ing in every time you need one simple command (I use it to update a remote apt repository when a new package has to be downloaded from another host).
Don't forget to set up key-based authentication, for maximum comfort.
If you need to fix a randomly failing test (a race condition), you have to run it until you hit that hard-to-reproduce failure.
Run this within a steady screen session.
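A sketch of the loop, with a fabricated flaky_test.sh that fails on its third run standing in for the real test:

```shell
# Stand-in flaky test: succeeds twice, then exits non-zero on run 3.
cat > flaky_test.sh <<'EOF'
#!/bin/sh
n=$(( $(cat runcount 2>/dev/null || echo 0) + 1 ))
echo "$n" > runcount
echo "run $n"
[ "$n" -lt 3 ]
EOF
chmod +x flaky_test.sh
# Re-run until the first failure: the loop stops as soon as the
# test returns a non-zero exit status.
while ./flaky_test.sh; do :; done
echo "failed after $(cat runcount) runs"
```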
You can get the approximate time when the remote server went down or exhibited other abnormal behavior.
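One common pattern is to timestamp each line of a long-running command's output; the printf below is a stand-in for something like ping example.com, so the example runs without network access:

```shell
# Prefix every line of output with a date and time, so you can later
# see when replies stopped arriving. Swap the printf stand-in for
# e.g.:  ping example.com
printf 'reply 1\nreply 2\n' | while read -r line; do
    echo "$(date '+%F %T') $line"
done
```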