All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Generate list of words and their frequencies in a text file.
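The command itself wasn't captured here; a minimal sketch that splits the text on non-alphanumeric characters and counts occurrences (file.txt is a placeholder):

$ tr -cs '[:alnum:]' '\n' < file.txt | sort | uniq -c | sort -rn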

Create a directory and change into it at the same time
How often do you make a directory (or series of directories) and then change into it to do whatever? 99% of the time that is what I do. This BASH function 'md' will make the directory path and then immediately change into the new directory. By using the 'mkdir -p' switch, the intermediate directories are created as well if they do not exist.
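The function itself wasn't captured here; a minimal sketch matching the description:

md() { mkdir -p -- "$1" && cd -- "$1"; }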

Grab a list of MP3s out of Firefox's cache
Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)
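The one-liner itself wasn't captured here; a sketch that assumes the older on-disk cache layout, where each cache entry is a plain file under the profile's Cache directory (newer Firefox versions use a different cache2 format):

$ find ~/.mozilla/firefox/*/Cache -type f -exec file {} + | awk -F': ' 'tolower($2) ~ /mpeg/ {print $1}'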

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
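The one-liner itself wasn't captured here; a sketch using Python's standard csv and json modules, which emits the rows as a JSON array of arrays:

$ python3 -c 'import csv,json,sys; print(json.dumps(list(csv.reader(open(sys.argv[1])))))' csv_file.csv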

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
    declare -i SS="$1"
    D=$(( SS / 86400 ))
    H=$(( SS % 86400 / 3600 ))
    M=$(( SS % 3600 / 60 ))
    S=$(( SS % 60 ))
    [ "$D" -gt 0 ] && echo -n "${D}:"
    [ "$D" -gt 0 -o "$H" -gt 0 ] && printf "%02d:" "$H"
    printf "%02d:%02d\n" "$M" "$S"
}
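For example, sec2dhms 100000 prints 1:03:46:40, and sec2dhms 125 prints 02:05.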

Search some text from all files inside a directory
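The command itself wasn't captured here; the usual approach is a recursive grep (the pattern and path are placeholders):

$ grep -rn 'some text' /path/to/directory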

create thumbnail of pdf
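The command itself wasn't captured here; a sketch using ImageMagick's convert, which needs Ghostscript for PDF input. The [0] selects the first page, and the filenames are placeholders:

$ convert -thumbnail x300 input.pdf[0] thumbnail.png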

Execute a command on logout
Executes a command on shell logout, and waits until it has finished before the shell closes.
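The command itself wasn't captured here; in bash this is typically done with a trap on EXIT, which runs synchronously before the shell closes (the command is a placeholder):

$ trap '/path/to/your-command' EXIT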

Download an entire website
The -p parameter tells wget to download all files needed to display the page, including images. -e robots=off tells wget not to obey the robots.txt file. -U mozilla sets your browser identity. --random-wait lets wget choose a random number of seconds to wait between retrievals, to avoid getting blacklisted. Other useful wget parameters: --limit-rate=20k limits the rate at which it downloads files. -b runs wget in the background, so it continues after you log out. -o $HOME/wget_log.txt logs the output.
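The full command itself wasn't captured here; a sketch combining those options with -r for recursive retrieval (the URL is a placeholder):

$ wget -r -p -e robots=off -U mozilla --random-wait http://www.example.com/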

DELETE all those duplicate files but one based on md5 hash comparison in the current directory tree
This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories). Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive. md5sum can be substituted with sha1sum without problems. The actual filename is not taken into account; just the hash is used. Whatever sort thinks is the first filename is kept. It is assumed that the filenames do not contain 0x00. As per the good suggestion in the first comment, this one does a hard link instead:

$ find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
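The deleting one-liner itself wasn't captured here; a sketch derived from the hard-link variant above, keeping whichever file sorts first and unlinking the rest:

$ find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); unlink($f) if ($h eq $ph);'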


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively; that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
