commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
To understand why this is the equivalent of "find -L /path/to/search -type l", see http://ynform.org/w/Pub/FindBrokenSymbolicLinks or look at http://www.gnu.org/software/findutils/manual/html_mono/find.html
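A quick demo of the idea, run against a scratch directory: with -L, find follows symlinks, so anything still reported as type "l" must be a link whose target is missing.

```shell
# Scratch directory with one dangling and one valid symlink.
DIR=$(mktemp -d)
ln -s /nonexistent-target "$DIR/broken"   # dangling link
ln -s /tmp "$DIR/ok"                      # valid link
# -L follows links, so only the broken one is still "type l"
find -L "$DIR" -type l
```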
From the cwd, recursively find all rar files, extracting each rar into the directory where it was found, rather than cwd.
A nice time saver if you've used wget or similar to mirror something, where each sub-directory contains a rar archive.
It's likely this can be tuned to work with multi-part archives where all parts use the ambiguous .rar extension, but I didn't test this. Perhaps unrar would handle this gracefully anyway?
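A sketch of the technique (assumes GNU find; the author's exact flags may differ): -execdir runs the command from the directory each match was found in, which is what makes the archive extract next to itself rather than in the cwd. Shown with a leading "echo" as a dry run; remove the echo to actually extract.

```shell
# Dry run: print the unrar command that would run in each match's own directory.
find . -name '*.rar' -execdir echo unrar x {} \;
```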
Sometimes my /var/cache/pacman/pkg directory gets quite big in size. If that happens I run this command to remove old package files. Packages that were upgraded in the last N days are kept, in case you are forced to downgrade a specific package. The command is obviously Arch Linux related.
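A hypothetical sketch of the idea (the author's exact command may differ), demoed against a scratch directory standing in for /var/cache/pacman/pkg, with N=30 days:

```shell
# Scratch stand-in for /var/cache/pacman/pkg
CACHE=$(mktemp -d)
touch -d '40 days ago' "$CACHE/old-pkg-1.0-1-x86_64.pkg.tar.zst"  # stale
touch "$CACHE/new-pkg-2.0-1-x86_64.pkg.tar.zst"                   # recent, kept
# List package files not modified in the last 30 days...
find "$CACHE" -type f -name '*.pkg.tar*' -mtime +30
# ...then append -delete to the find line once the list looks right.
```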
Searches for *.cpp and *.h in directory structure, counts the number of lines for each matching file and adds the counts together.
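One way to do this (not necessarily the author's exact command): feed every matching file to a single wc -l via cat.

```shell
# Total line count across all .cpp and .h files under the current directory.
find . -type f \( -name '*.cpp' -o -name '*.h' \) -exec cat {} + | wc -l
```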
Probably neither faster nor better than -delete in find. It's just that I generally dislike teaching find builtin actions.
Also shows files as they are found. Only works from a tty.
If you've ever hit "argument list too long" using rm to delete multiple files matching a pattern, this will help you.
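The general shape of the fix, demoed in a scratch directory: find streams the matches and xargs batches them into as many rm invocations as needed, so no single command line gets too long. The -print0/-0 pairing keeps filenames with spaces safe.

```shell
DIR=$(mktemp -d)
touch "$DIR/a.tmp" "$DIR/b.tmp" "$DIR/keep.txt"
# Delete every *.tmp without ever building one giant argument list.
find "$DIR" -name '*.tmp' -print0 | xargs -0 rm -f
```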
touch -t 201208211200 first ; touch -t 201208220100 last ;
creates 2 files: first & last, with timestamps that the find command should look between:
201208211200 = 2012-08-21 12:00
201208220100 = 2012-08-22 01:00
then we run find command with "-newer" switch, that finds by comparing timestamp against a reference file:
find /path/to/files/ -newer first ! -newer last
meaning: find any files in /path/to/files that are newer than file "first" and not newer than file "last"
pipe the output of this find command through xargs to a move command:
| xargs -ifile mv -fv file /path/to/destination/
and finally, remove the reference files we created for this operation:
rm first; rm last;
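The steps above, combined into one runnable demo (scratch directories stand in for the real paths; -I{} is the portable spelling of -ifile, and -type f is added here so the reference files and directories stay out of the match):

```shell
SRC=$(mktemp -d); DEST=$(mktemp -d)
touch -t 201208211800 "$SRC/in-range"   # 2012-08-21 18:00 -> inside the window, moved
touch -t 201208200900 "$SRC/too-old"    # 2012-08-20 09:00 -> before the window, stays
touch -t 201208211200 first
touch -t 201208220100 last
find "$SRC" -type f -newer first ! -newer last | xargs -I{} mv -fv {} "$DEST"/
rm first last
```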
This command allows you to quickly find any executable on your system by keyword(s).
NOTE: Sometimes this command will produce output like this:
`hello.py.launch': No such file or directory
This is normal behaviour.
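A hypothetical sketch of the approach (not necessarily the author's exact command): split $PATH into its directories and search each one for names containing the keyword. Errors from $PATH entries that don't exist are silenced, which is exactly where messages like the one above come from. Note the deliberately unquoted $(...) so the shell word-splits on spaces; this breaks if a $PATH entry itself contains spaces.

```shell
# Find every command on $PATH whose name contains "grep".
find $(echo "$PATH" | tr ':' ' ') -maxdepth 1 -name '*grep*' 2>/dev/null
```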
This is useful when you are uploading svn project files to a new git repo.
This command finds all the files whose status has changed between the ctimes of the older and the newer reference.
Very useful if you can see from an ls listing a block of consecutive files you want to move or delete, but can't figure out exactly the time range by date.
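With GNU find you can also skip reference files entirely: -newerct compares each file's ctime against a timestamp given inline (same date grammar as GNU date -d). A demo in a scratch directory, where a just-created file's ctime is "now":

```shell
DIR=$(mktemp -d)
touch "$DIR/fresh"
# Files whose status changed within the last hour:
find "$DIR" -type f -newerct '1 hour ago'
# A bounded range, mirroring the reference-file example above:
# find /path/to/files -type f -newerct '2012-08-21 12:00' ! -newerct '2012-08-22 01:00'
```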
These should be a little faster since they don't have to spawn grep.
My most used bash function without a doubt!
Allows you to set the execution bit on 'shell'-compatible files even when their names don't match *.sh
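A hypothetical sketch of one way to do this (the author's exact command may differ): grant the execute bit to every file whose first line is a shell shebang, whatever the filename. Demoed in a scratch directory.

```shell
DIR=$(mktemp -d)
printf '#!/bin/sh\necho hi\n' > "$DIR/run-me"   # shell script, no .sh suffix
printf 'plain text\n' > "$DIR/notes.txt"        # not a script, left alone
# chmod +x only the files whose first line looks like a shell shebang
find "$DIR" -type f -exec sh -c \
  'head -n 1 "$1" | grep -q "^#!.*sh" && chmod +x "$1"' _ {} \;
```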
An example of this command that includes the -name arg.