All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Search for a string inside all files in the current directory
This is how I typically grep:
-R  recurse into subdirectories
-n  show line numbers of matches
-i  ignore case
-s  suppress "doesn't exist" and "can't read" messages
-I  ignore binary files (technically, process them as having no matches, which is important for showing inverted results with -v)
I have grep aliased to "grep --color=auto" as well, but that's a matter of formatting, not function.
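Assembled from those flags, the command is presumably of this shape (the search string is a placeholder):

$ grep -RnisI 'some string' .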

Neatly extract a rar-compressed file
It's also possible to delay the extraction (echo "unrar e ... fi" | at now+20 minutes), which is really convenient!
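The extraction at the core of the entry is presumably a plain unrar call like this (the archive name is a placeholder; e extracts into the current directory without recreating archived paths):

$ unrar e archive.rar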

GREP a PDF file.
This is a good alternative to pdftotext for Ubuntu. To install it: sudo apt-get install python-pdfminer
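python-pdfminer installs a pdf2txt tool (pdf2txt.py upstream), so the greppable pipeline is presumably something like this (the file name and pattern are placeholders):

$ pdf2txt file.pdf | grep -i 'pattern'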

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
    declare -i SS="$1"
    D=$(( SS / 86400 ))
    H=$(( SS % 86400 / 3600 ))
    M=$(( SS % 3600 / 60 ))
    S=$(( SS % 60 ))
    [ "$D" -gt 0 ] && echo -n "${D}:"
    # print the hours field whenever days were printed, so 1 day 0 h doesn't collapse to D:MM:SS
    [ "$D" -gt 0 ] || [ "$H" -gt 0 ] && printf "%02g:" "$H"
    printf "%02g:%02g\n" "$M" "$S"
}
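A quick sanity check of the function above (100000 s is 1 day, 3 h, 46 min, 40 s):

$ sec2dhms 100000
1:03:46:40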

Download all music files off of a website using wget
This will download all files of the type specified after "-A" from a website. Here is a breakdown of the options:
-r  turns on recursion and downloads all links on the page
-l1  goes only one level of links into the page (this is really important when using -r)
-H  spans domains, meaning it will download links to sites that don't have the same domain
-nd  means put all the downloads in the current directory instead of making all the directories in the path
-A mp3  filters to only download links that are mp3s (this can be a comma-separated list of different file formats to search for multiple types)
-e robots=off  just means to ignore the robots.txt file, which stops programs like wget from crashing the site... sorry http://example/url lol..
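Put together, the flags above give a command of this shape (the URL placeholder is kept from the description):

$ wget -r -l1 -H -nd -A mp3 -e robots=off http://example/url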

IP addresses connected to port 80
Lists the IP addresses connected to port 80, along with the number of connections from each.
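A sketch of the usual shape of such a one-liner (netstat's column layout varies between systems, so the field numbers here are an assumption):

$ netstat -tn | awk '$4 ~ /:80$/ {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn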

Grab the first 3 octets of your IP addresses
For machines that have many IP blocks spanning different Class C's, this will show which ones are in use.
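One hedged way to do it with standard tools (interface-listing output differs across systems, so this iproute2-based parsing is an assumption):

$ ip -4 addr | awk '/inet /{print $2}' | cut -d. -f1-3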

Extract title from HTML files
This command can be used to extract the title defined in HTML pages.
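A minimal sed sketch of the idea (it assumes the <title> element fits on one line):

$ sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p' file.html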

VirtualBox rsync copy (without defining any VirtualBox configuration)
That is, after running `vagrant ssh-config` to determine ports and IPs:

$ vagrant ssh-config
Host default
  HostName 127.0.0.1
  User vagrant
  Port 2200
  UserKnownHostsFile /dev/null
  StrictHostKeyChecking no
  PasswordAuthentication no
  IdentityFile /Users/romanvg/tmp/.vagrant/machines/default/virtualbox/private_key
  IdentitiesOnly yes
  LogLevel FATAL
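Fed those values, the rsync call is presumably along these lines (the /vagrant source directory and the local destination are placeholders):

$ rsync -av -e "ssh -p 2200 -i /Users/romanvg/tmp/.vagrant/machines/default/virtualbox/private_key -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" vagrant@127.0.0.1:/vagrant/ .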

cat a file backwards
Or "tail -r" on Solaris.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
