All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Recursively remove .svn directories from a local repository
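The command itself didn't survive in this copy; a common way to do it, assuming GNU find, is:

$ find . -type d -name .svn -prune -exec rm -rf {} +

# -prune keeps find from trying to descend into the directories it just deleted.
Swap the -exec clause for -print first if you want to preview what would be removed.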

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
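The command was not preserved here; a typical form uses lsof:

$ sudo lsof -i :80

Named ports work too ($ sudo lsof -i :http), since lsof resolves service names from /etc/services.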

Broadcast your shell thru ports 5000, 5001, 5002 ...
Running 'nc yourip 5000', 'nc yourip 5001' or 'nc yourip 5002' elsewhere will produce an exact mirror of your shell. This is handy when you want to show someone else some amazing stuff in your shell without giving them control over it.
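The original one-liner is missing from this copy; a sketch that matches the description, assuming util-linux script and a netcat that supports -l (listen) and -k (keep listening after a client disconnects):

$ script -qf | tee >(nc -kl 5000) >(nc -kl 5001) >(nc -kl 5002)

# script -q (quiet) -f (flush after each write) copies your terminal session to stdout;
# tee fans it out to three listening netcats via process substitution.
Viewers then run 'nc yourip 5000' (or 5001, 5002) and get a read-only mirror of the session.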

add files to existing growable DVD using growisofs
Replace "directory name with files to add to DVD" with the actual directory containing the files you want to add to the growable DVD.
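The command line itself is missing here; standard growisofs usage for appending to an open (growable) disc is along these lines:

$ growisofs -M /dev/dvd -R -J "directory name with files to add to DVD"

# -M merges a new session into the existing one; -R and -J add Rock Ridge and Joliet
# extensions so filenames survive on both Unix and Windows.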

clone directory structure
dir1 and all its subdirectories (and their subdirectories, and so on) are recreated under dir2, but *no files* are copied; not even symbolic links to the files are made. To preserve ownerships & permissions: $ cp -Rps dir1 dir2

Yes, you can do it with $ rsync -a --include '*/' --exclude '*' /path/to/source /path/to/dest too, but I didn't test whether that handles attributes correctly (experiment with the rsync command yourself, using the --dry-run switch, to avoid harming your file system).

You must be in the parent directory of dir1 while executing this command (place dir2 wherever you like); otherwise symbolic links to the files will be made in dir2. I couldn't find a way around this "limitation" (yet). Maybe a recursive unlink loop would do it? PS: Bash will complain, but the job will be done.
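The command this note describes isn't shown in this copy; from the behaviour described (symlink complaints, structure-only copy) it is evidently cp's symbolic-link mode, i.e. something like:

$ cp -Rs dir1 dir2

# -R recurses; -s makes symlinks instead of copying file contents. With relative paths,
# cp refuses to create the file symlinks (the complaints mentioned above) but still
# creates every directory, leaving a file-free clone of the tree.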

pretend to be busy in office to enjoy a cup of coffee
Not as taxing on the CPU.
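The entry's command is not reproduced here; an illustrative stand-in that scrolls official-looking output while staying mostly idle (the sleep is what keeps the CPU load down, matching the note above):

$ while true; do head -c 256 /dev/urandom | hexdump -C; sleep 0.5; done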

DELETE all those duplicate files but one, based on md5 hash comparison, in the current directory tree

This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories). Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive. md5sum can be substituted with sha1sum without problems.

The actual filename is not taken into account; just the hash is used. Whichever filename sort sees first is kept. It is assumed that filenames do not contain 0x00.

As per the good suggestion in the first comment, this variant makes a hard link instead of deleting: $ find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
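The delete variant itself is missing from this copy; it is presumably the hard-link pipeline above with the link() step dropped, i.e.:

$ find . -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); unlink($f) if ($h eq $ph);'

# Lines arrive sorted by hash, so a line whose hash equals the previous one is a
# duplicate and its file is unlinked; the first file seen with each hash survives.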

list block devices
Shows all block devices in a tree with descriptions of what they are.
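This is almost certainly lsblk from util-linux:

$ lsblk

# Add columns if the default view isn't enough, e.g.:
$ lsblk -o NAME,SIZE,TYPE,FSTYPE,MOUNTPOINT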

Listing today’s files only
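No command survived for this entry; one way to get it with GNU find, assuming "today" means since midnight (-daystart) rather than the last 24 hours:

$ find . -maxdepth 1 -daystart -type f -mtime 0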

find duplicate messages in a Maildir
# find assumes email files start with a number 1-9
# sed joins the lines starting with " " to the previous line
# gawk prints the Received and From lines
# sort sorts according to the second field (received+from)
# uniq prints the duplicated filename
# a message is considered a duplicate if it was received at the same time as another message, and from the same person
The command was intended to be run under cron. If run in a terminal, mutt can be used instead: $ mutt -e "push otD~=xq" -f $folder
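The pipeline itself didn't survive extraction; a sketch assembled from the comments above (Maildir filenames without whitespace assumed, and only the first Received: header is used as the timestamp):

$ find Maildir/ -type f -name '[1-9]*' | while read -r f; do
    sed -e ':a' -e 'N' -e '$!ba' -e 's/\n[ \t]/ /g' "$f" |
    gawk -v f="$f" '/^Received:/ && !r {r=$0} /^From:/ && !s {s=$0} END {print f "\t" r " " s}'
  done | sort -k 2 | uniq -D -f 1 | cut -f 1

# sed unfolds header continuation lines; gawk emits "filename <TAB> received+from";
# sort and uniq -D -f 1 keep only lines whose key (everything after the filename) repeats;
# cut prints just the duplicated filenames.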


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and of 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
