All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$H" -gt 0 ] && printf "%02g:" "$H"
  printf "%02g:%02g\n" "$M" "$S"
}
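For example (assuming the function above has been sourced into the current shell), 93784 seconds is 1 day, 2 hours, 3 minutes and 4 seconds:

$ sec2dhms 93784
1:02:03:04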

Show Directories in the PATH Which Do NOT Exist
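The submitted command isn't reproduced here; a minimal sketch of the idea, assuming bash, is to split $PATH on ':' and test each entry:

$ echo "$PATH" | tr ':' '\n' | while read -r dir; do [ -d "$dir" ] || echo "$dir"; done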

Create directory named after current date
Not a discovery but a useful one nonetheless. In the above example the date format is 'yyyymmdd'. For other possible formats see 'man date'. This command can also be very convenient when aliased to some meaningful name: $ alias mkdd='mkdir $(date +%Y%m%d)'
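The command the alias wraps (and which the text above refers to) is simply:

$ mkdir $(date +%Y%m%d)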

Print all the lines between 10 and 20 of a file
Similarly, if you want to print from line 10 to the end of the file you can use: sed -n '10,$p' filename
This is especially useful if you are dealing with a large file. Sometimes you just want to extract a sample without opening the entire file. Credit goes to wbx & robert at the comments section of http://www.commandlinefu.com/commands/view/348/get-line1000-from-text.#comment
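A minimal command matching the title (the original submission may differ slightly):

$ sed -n '10,20p' filename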

Go to the Nth line of file
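The submitted command isn't reproduced here; one common sketch that prints only the Nth line is:

$ sed -n "${N}p" file

or, if the intent is to open an editor at that line:

$ vim +N file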

count the number of specific characters in a file or text stream
In this example, the command will recursively find files (-type f) under /some/path, where the path ends in .mp3, case insensitive (-iregex). It will then output the results on a single line (-print0), each terminated by the null character (octal 000), suitable for piping to xargs -0. This type of output avoids issues with garbage in paths, like unclosed quotes. The tr command then strips away everything but the null chars, finally piping to wc -c to get a character count.

I have found this very useful to verify one is getting the right number of results before actually processing them through xargs or similar. Yes, one can issue the find without the -print0 and use wc -l, however if you want to be 1000% sure your find command is giving you the expected number of results, this is a simple way to check.

The approach can be made into a function and then included in .bashrc or similar, e.g. $ count_chars() { tr -d -c "$1" | wc -c; }
In this form it provides a versatile character counter of text streams :)
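A small usage sketch of the function (counting how many times 'l' appears):

$ printf 'hello world\n' | count_chars 'l'
3

And the find pipeline the description walks through would look roughly like this (the path and pattern are only examples):

$ find /some/path -type f -iregex '.*\.mp3$' -print0 | tr -d -c '\000' | wc -c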

'readlink' equivalent using shell commands, and following all links
This is an equivalent of the GNU 'readlink' tool, but it supports following all the links, even in different directories. An interesting alternative is this one, which gets the path of the destination file:

myreadlink() {
  [ ! -h "$1" ] && echo "$1" || (
    local link="$(expr "$(command ls -ld -- "$1")" : '.*-> \(.*\)$')"
    cd $(dirname $1)
    myreadlink "$link" | sed "s|^\([^/].*\)\$|$(dirname $1)/\1|"
  )
}
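A quick usage sketch (the paths here are just examples):

$ ln -s /etc/hosts /tmp/hosts-link
$ myreadlink /tmp/hosts-link
/etc/hosts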

send echo to a network socket
Using netcat, usually installed on Debian/Ubuntu. Also, to test against a sample server, the following two commands may help (the second one uses the Python 2 SocketServer module):

echo got milk? | netcat -l -p 25

python -c "import SocketServer; SocketServer.BaseRequestHandler.handle = lambda self: self.request.send('got milk?\n'); SocketServer.TCPServer(('0.0.0.0', 25), SocketServer.BaseRequestHandler).serve_forever()"
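The submitted command itself isn't reproduced above; a sketch of sending a line of text to a TCP socket (host and port are examples) is:

$ echo "got milk?" | nc localhost 25

or, using bash's /dev/tcp pseudo-device:

$ echo "got milk?" > /dev/tcp/localhost/25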

df output, sorted by Use% and correctly maintaining header row
Show disk space info, grepping out the uninteresting ones beginning with ^none while we're at it. The main point of this submission is the way it maintains the header row with the command grouping, by removing it from the pipeline before it gets fed into the sort command. (I'm surprised sort doesn't have an option to skip a header row, actually...) It took me a while to work out how to do this; I thought of it as I was drifting off to sleep last night!
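The submitted command isn't shown above; a sketch of the technique described (Use% is column 5 in df -h output) might look like:

$ df -h | grep -v '^none' | { read -r header; echo "$header"; sort -rn -k5; }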

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
