commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world, so that others can gain from your CLI wisdom and you from theirs. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 or 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Tested with GNU and BSD ls.
Find all files in SVN working-copy directories that are uncommitted, list them, and show their properties.
This is a quick way to find what is hogging disk space when you get a full-disk alert from your monitoring system. It won't work as-is on filesystems that allow embedded spaces in user names or groups (read: Mac OS X attached to a Windows domain). In those cases, you will need to change the -k 5 to something that works in your situation.
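The original command isn't reproduced here, but the idea can be sketched with du and sort (a minimal sketch, not the original; "/var" is only a placeholder path):

```shell
# List the 20 biggest directory subtrees under /var, largest first.
# -x stays on one filesystem; -k reports sizes in kilobytes so sort -rn works.
du -xk /var 2>/dev/null | sort -rn | head -n 20
```

Drilling into the top entry and re-running usually finds the culprit in a couple of iterations.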
Useful when you forgot to use pv or rsync and want to know how much has already been copied.
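A simple way to check progress after the fact (a sketch; "/mnt/backup" is a placeholder for your destination):

```shell
# Show the total size copied into the destination so far.
# Re-run it, or wrap it in watch(1), to see the number grow.
du -sh /mnt/backup
```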
Provides a recursive, time-ordered list of files under the current directory modified in the last 3 minutes.
Excluding zero byte files:
ls -lF -darth `find . -size +0 -mmin -3`
For the last day's files, change "-mmin -3" to "-mtime -1":
ls -lF -darth `find . -size +0 -mtime -1`
Shows you the symlinks in the current directory, recursively, but without following them.
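One way to do this with find (an assumption, not necessarily the original command):

```shell
# -type l matches the symlinks themselves; find does not follow
# symlinks unless you explicitly ask it to (with -L or -follow).
find . -type l -ls
```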
xargs will automatically determine how many args are too many and only pass a reasonable number of them at a time. In the example, 500,002 file names were split across 26 invocations of the command "echo".
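The batching is easy to see by capping the command-line size artificially with -s (the 2048-byte limit and 10,000 arguments here are chosen purely for illustration):

```shell
# With each command line capped at ~2 KB, xargs must split 10,000
# arguments over several echo invocations; each invocation prints one line.
seq 10000 | xargs -s 2048 echo | wc -l
```

Without -s, xargs derives the limit from the system's actual ARG_MAX, so the number of invocations varies between systems.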
Very quick! Based only on the content sizes and the character counts of the filenames; if both numbers are equal, the two (or more) directories are most likely identical.
If in doubt, apply:
diff -rq path_to_dir1 path_to_dir2
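The quick size-plus-names comparison can be sketched like this (a hedged reconstruction using a hypothetical fingerprint helper, not the original command; -printf needs GNU find). Equal fingerprints suggest, but do not prove, identical trees:

```shell
# Hash the sorted "size path" listing of each tree; compare the hashes.
fingerprint() { (cd "$1" && find . -type f -printf '%s %p\n' | sort | md5sum); }
fingerprint path_to_dir1
fingerprint path_to_dir2
```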
AWK function taken from here:
Displays a list of all file extensions in the current directory and how many files there are of each extension, in ascending order (case-insensitive).
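An assumed reconstruction of the pipeline (not necessarily the original; it only considers files with a dot in their name):

```shell
# Take the last dot-separated field of each filename, lowercase it,
# then count occurrences and sort the counts ascending.
find . -maxdepth 1 -type f -name '*.*' \
  | awk -F. '{print tolower($NF)}' | sort | uniq -c | sort -n
```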
Sometimes none of the usual variables such as $DESKTOP_SESSION, $GDMSESSION, or $WINDOWMANAGER are set.
Old drive with lots of music, or an unsorted drive? This command plays every mp3 file in a folder; after a song finishes (or you press q), it asks whether you want to delete the file.
You can omit the -d to see what's inside directories. In that case, you may want -a to see dotfiles inside those directories. (Otherwise you don't need -a since you're explicitly looking at them.)
alias lst="ls -ls -tr | tail"
Find all files larger than 500M in the home directory and print them ordered by size with full info about each file.
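A hedged sketch of one way to do it (-size +500M needs GNU or a modern BSD find):

```shell
# ls -lhS prints full details sorted largest-first; -exec ... + batches
# the matches so ls sees them together.
find ~ -type f -size +500M -exec ls -lhS {} +
```

Note that with a very large number of matches, -exec ... + may invoke ls more than once, so the ordering is only per batch.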
This one works without an external program (watch), which is not available on every system. HINT: press Ctrl+C to exit the loop.
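A minimal sketch of the watch-free pattern ("date" and the 2-second interval are placeholders for your command and delay):

```shell
# Re-run a command forever, redrawing the screen each time; Ctrl+C stops it.
while true; do
  clear
  date
  sleep 2
done
```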