commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 votes and at least 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
This command deletes all files in all subfolders if their name or path contains "deleteme".
To dry-run the command without actually deleting files run:
find . | grep deleteme | while read -r line; do echo rm "$line"; done
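The grep-based pipeline above breaks on paths containing spaces or shell metacharacters. A more robust sketch, letting find do the matching itself (assuming GNU find with `-delete`):

```shell
# Dry run: print every path containing "deleteme" anywhere in it
find . -path '*deleteme*' -print

# Actual deletion: -depth removes directory contents before the directory itself
find . -depth -path '*deleteme*' -delete
```

Because find hands each path to `-delete` directly, no word-splitting or globbing ever happens on the filenames.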
If you need to delete all redundant ".svn" directories from a given path and all its subdirectories, use this command!
Particularly useful if you want to upload to an FTP server but don't use svn, or if you need to update/backup some source code to another directory.
You can also try "svn export . /new/path/without/svn/dirs" (also from the CLI)
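The command being described is not shown here, but a common idiom for stripping ".svn" directories looks like this (destructive - dry-run with `-print` first):

```shell
# Remove every .svn directory under the current path
find . -type d -name .svn -exec rm -rf {} +
```

The `-exec … {} +` form batches many paths into one rm invocation, and matching on `-type d -name .svn` leaves ordinary files untouched.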
For a python project, sometimes I need to clean all the compiled python files. I have an alias 'rmpyc' to this command. This really saves me a lot of typing and hunting throughout the folders to delete those files.
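The command behind the alias is not shown; a plausible definition (the alias name comes from the text, the body is an assumption) would be:

```shell
# Delete all compiled Python files below the current directory
alias rmpyc="find . -name '*.pyc' -delete"
```

Running `rmpyc` from the project root then recurses through every subfolder, so there is no need to hunt for the files by hand.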
This command will remove all .svn folders from your project if you need to manually remove the Subversion files.
Copy data to the destination using commands such as cpio (recommended), tar, rsync, ufsdump, or ufsrestore.
Let the source directory be /source, and let the destination directory be /destination.
# cd /source
# cd ..
# find ./source -depth -print | cpio -cvo > /destination/source_data.cpio
# cd /destination
# cpio -icvmd -I ./source_data.cpio
# rm -rf ./source_data.cpio
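For comparison, tar (one of the alternatives mentioned above) achieves the same copy in a single pipeline, with no intermediate archive file to clean up - a sketch, keeping the same /source and /destination names:

```shell
# Copy /source into /destination/source, preserving permissions and timestamps
mkdir -p /destination/source
(cd /source && tar -cf - .) | (cd /destination/source && tar -xpf -)
```

The subshells keep the `cd`s local, and `-p` on extraction preserves the original file modes.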
Deletes logs not modified in over [#] days; modify the command to compress or move them instead, as needed.
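The command itself is not shown above; a sketch of the usual pattern, with /var/log and 30 standing in as illustrative values for the path and for [#]:

```shell
# Delete *.log files not modified in the last 30 days (30 is a placeholder)
find /var/log -name '*.log' -mtime +30 -exec rm {} +
```

Swapping `rm` for `gzip` or `mv` in the `-exec` clause gives the compress/move variants the description mentions.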
This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run, and you may want to throttle the spidering so as to play nicely.
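The original command is not reproduced here; it presumably resembles the following sketch (the option set and log-parsing step are assumptions; `--wait` provides the throttling mentioned above):

```shell
# Crawl the site without saving page content, logging what was visited
wget --spider --recursive --no-verbose --wait=1 \
     --output-file=spider.log http://example.com/

# Pull the unique URLs out of the log into url-list.txt
grep -oE 'https?://[^ ]+' spider.log | sort -u > url-list.txt
```

`--spider` makes wget issue requests without writing the fetched pages to disk, so only the log remains to be parsed.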