Commands by fmuster (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1"
        declare -i D=$(( SS / 86400 ))         # whole days
        declare -i H=$(( SS % 86400 / 3600 ))  # remaining hours
        declare -i M=$(( SS % 3600 / 60 ))     # remaining minutes
        declare -i S=$(( SS % 60 ))            # remaining seconds
        [ "$D" -gt 0 ] && echo -n "${D}:"
        # print the hours field whenever days are shown, even if it is zero,
        # so that 1 day 0 hours renders as 1:00:MM:SS rather than 1:MM:SS
        [ "$D" -gt 0 ] || [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }
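For example, `sec2dhms 100000` prints 1:03:46:40 (1 day, 3 hours, 46 minutes and 40 seconds).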

Create a random file of a specific size
This will create a 10 MB file named testfile.txt; change the count parameter to change the file's size. As one commenter pointed out, yes, /dev/random can be used, but the content doesn't matter if you just need a file of a specific size for testing purposes, which is why I used /dev/zero. The file size is what matters, not the content; it's 10 MB either way. "Random" just referred to "any file - content not specific".
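A sketch of such a command, with the exact options an assumption based on the description:

    dd if=/dev/zero of=testfile.txt bs=1M count=10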

List processes in unkillable state D (iowait)
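One way to list them, assuming Linux procps ps (the ^D anchor also filters out the header line):

    ps -eo state,pid,cmd | grep "^D"

State D means uninterruptible sleep; such processes ignore signals until the I/O they are waiting on completes.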

Displaying system temperature
Displays the temperature of your system. A shellcode version is available at http://inj3ct0r.com/exploits/12554
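On Linux this can be done with the acpi tool or by reading the kernel's thermal interface directly (the zone number is machine-dependent):

    acpi -t
    cat /sys/class/thermal/thermal_zone0/temp    # value in millidegrees Celsius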

Get a directory and file tree listing from an FTP server
Creates a file whose contents look like the output of `du -a`, except that it describes the remote server's filesystem hierarchy. Very useful later for grepping without opening a remote connection.
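A sketch of one way to produce such a file, assuming the lftp client (the server address is a placeholder):

    lftp -e "find /; exit" ftp://anonymous@ftp.example.com > remote-tree.txt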

Testing hard disk reading speed
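A common way to measure this, assuming hdparm; the device name /dev/sda is an assumption, and root privileges are required:

    sudo hdparm -t /dev/sda

The -t option times sequential reads from the device without previously cached data, giving a rough figure for raw read throughput.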

limit the cdrom drive to a specified speed
This command limits the drive to 8x speed until the disc is next ejected. It can be useful when you don't want to listen to the noise of your CD-ROM drive.
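Assuming the eject utility, which can set the drive speed with its -x option (the device path is an assumption):

    eject -x 8 /dev/cdrom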

A very quick and slick fork bomb. Handle with care, and please don't run it on production systems...
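Presumably the classic bash one-liner, reproduced here for reference only - running it will rapidly exhaust a machine's process table:

    :(){ :|:& };:

It defines a function named ':' that pipes into itself and backgrounds the copy, then calls it, doubling the number of processes until the system runs out of resources.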

gzip over ssh
I've kept the gzip compression at a low level, but depending on the CPU power available on the source machine you may want to increase it. However, SQL compresses really well, and I found that even with -1 I was able to transfer 40 MiB/s over a 100 Mbps wire, which was good enough for me.
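A sketch of the pipeline (file and host names are placeholders):

    gzip -1 -c dump.sql | ssh user@remotehost 'gunzip -c > dump.sql'

The data crosses the wire compressed, which is why the effective transfer rate can exceed the link's raw bandwidth.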

archive all files containing local changes (svn)
This works more reliably for me (the variant using "cut -c 8-" had one extra space, so it did not work).
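One way to build such an archive (the archive name is a placeholder; note that paths containing spaces would break the xargs step):

    svn status -q | awk '{print $NF}' | xargs tar -czf local-changes.tar.gz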


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
