commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
By putting the "-not \( -name .svn -prune \)" in the very front of the "find" command, you eliminate the .svn directories in your find command itself. No need to grep them out.
You can even create an alias for this command:
alias svn_find="find . -not \( -name .svn -prune \)"
Now you can do things like
svn_find -mtime -3
What happens if there is more than a single space between words, or spaces and tabs? This command will remove duplicate spaces and tabs.
The "-r" switch allows for extended regular expressions. No additional piping necessary.
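The command itself isn't reproduced above, but the idea can be sketched with GNU sed (the sample input and exact pattern are assumptions):

```shell
# Squeeze any run of spaces and/or tabs down to a single space.
# -r enables extended regular expressions (GNU sed);
# [[:blank:]] matches both spaces and tabs.
printf 'hello     world\tagain\n' | sed -r 's/[[:blank:]]+/ /g'
# -> hello world again
```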
Shortest alternative, without the speed-o-meter: "xclip large.xml"
Use "xclip -o" to get the clipboard content, or alternatively Shift+Insert or the middle button of your mouse.
OK, not the most useful but a good way to impress friends. Requires the "display" command from ImageMagick.
avoid mouse abuse and the constant struggle of balancing scroll velocity ... not to mention that burning sensation in your upper right shoulder ....
This is useful to highlight only some lines without losing the others (e.g. in source code, logs, scripts).
In turn you can get the contents of your clipboard by typing xsel by itself with no arguments:
This command requires you to install the xsel utility which is free
Built-in function in linux, should work on any distro
Built-in function in Linux, so it should work on any Linux distribution.
You could start this one with
for f in *; do
BUT using find with "-type f" ensures you only get files, not any directories you might have.
It'll also create backups of the files it's overwriting
Of course, this assumes that you don't have any files with duplicated filenames in your target structure
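The original command isn't shown here, but a minimal sketch of the idea, assuming you are flattening files into a hypothetical flat/ directory using GNU cp's numbered backups:

```shell
# Collect every regular file (and only files, thanks to -type f)
# into flat/, keeping numbered backups (name.~1~, name.~2~, ...)
# instead of silently overwriting duplicate filenames.
mkdir -p flat
find . -path ./flat -prune -o -type f -print | while IFS= read -r f; do
    cp --backup=numbered "$f" flat/
done
```

Note that the read loop breaks on filenames containing newlines; for a one-off cleanup that's usually acceptable.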
Generates a frequency sweep from $x to $y, with $d numbers in between each step, and with each tone lasting $l milliseconds.
Found in the comments section; works on most Linux flavors.
Shows a list of running virtual machines on a vmware host (workstation/server/esx/etc.)
Again, this command is vmware-specific.
There are also other things you can do with `vmrun`. Type vmrun by itself (no arguments) to get a readout of its other capabilities.
There is no output from this command. The command boots a virtual machine and you will have to wait for the boot sequence to complete before you can ping or connect to the virtual machine via ssh/rdp/vnc/nx/etc.
To check whether the table of contents in a LaTeX document is up to date, copy the .toc file to a backup before running LaTeX and compare the new .toc to the backup. If they are identical, the table of contents is up to date; if not, you need to run LaTeX again.
LaTeX is not a smart compiler - you need to run it several times to make it back-patch all the missing refs. The message telling you whether to do so is buried in its endless output and the log file. This grep helps to find it.
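Neither command is reproduced above, but both checks can be sketched, assuming the document is doc.tex so the generated files are doc.toc and doc.log (the filenames are assumptions):

```shell
# 1) TOC check: identical files mean the table of contents is stable.
cmp -s doc.toc doc.toc.bak && echo "TOC up to date"
# 2) Rerun check: surface the hint LaTeX buries in its log, e.g.
# "LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right."
grep 'Rerun to get' doc.log && echo "run LaTeX again"
```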
Puts words on new lines, removing additional newlines.
Simply translates whitespace to newlines. Could be enhanced to compress out extra newlines, but that might be better handled by the next tool down the pipe, e.g. uniq(1).
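As a concrete sketch of that pipeline (the sample input is made up):

```shell
# Translate runs of whitespace into single newlines: one word per line.
# -s squeezes repeats, so no blank lines appear between words.
echo "foo  bar   baz" | tr -s '[:space:]' '\n'
# -> foo
#    bar
#    baz
```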
Remove annoying improperly packaged files that untar into the incorrect directory.
For example, when you untar an archive and it extracts hundreds of files into the current directory... bleh.
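One way to script the cleanup, assuming the offending archive is archive.tar (the filename is an assumption), is to feed the archive's own file list back to rm:

```shell
# List the archive's contents and delete exactly those paths from the
# current directory; -- guards against names starting with "-".
# Caveat: whitespace in filenames will break the xargs word-splitting.
tar tf archive.tar | xargs rm -rf --
```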
Basically it creates a typical word list file from any normal text.
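The exact command isn't shown; a sketch of a typical word-list pipeline, assuming the input file is input.txt:

```shell
# Turn every run of non-letters into a newline (one word per line),
# lowercase everything, then sort and drop duplicates.
tr -cs '[:alpha:]' '\n' < input.txt | tr '[:upper:]' '[:lower:]' | sort -u
```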