All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Virtualbox: setup hardware
Where:
- --memory 256: assigns 256 MB of RAM.
- --acpi on: enables ACPI (mandatory if you use Windows 2000).
- --ioapic off: disables the IO APIC. Not needed if the virtual machine has a single CPU or runs a 32-bit operating system. Like ACPI, this switch is mandatory for Windows 2000.
- --pae on: enables Physical Address Extension, so that more than 4 GB of RAM can be used on x86 CPUs.
- --hwvirtex on: enables hardware virtualization extensions for processors that have this feature (it should also be enabled in the motherboard BIOS).
- --nestedpaging on: lets part of the memory-management work be done directly by the hardware.
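The VBoxManage invocation itself is not shown in this listing; assuming a VM named "WinXP" (a placeholder), a command matching the options described would look roughly like:

VBoxManage modifyvm "WinXP" --memory 256 --acpi on --ioapic off --pae on --hwvirtex on --nestedpaging on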

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$H" -gt 0 ] && printf "%02g:" "$H"
  printf "%02g:%02g\n" "$M" "$S"
}
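For example, sec2dhms 3661 prints 01:01:01, and sec2dhms 100000 prints 1:03:46:40; the day and hour fields are omitted when they are zero.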

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include: * $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one * the TTL for the credentials
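The command itself is not reproduced in this listing; a plausible reconstruction based on the description (the 3600-second TTL and the prompt wording are assumptions):

read -p "MFA code: " MFA_CODE
aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" --duration-seconds 3600 \
  --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text | \
  awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'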

Count the total amount of hours of your music collection
First, the find command finds all files in your current directory (.). This is piped to xargs so the rest of the pipeline can run in parallel. The xargs -P argument specifies how many processes to run in parallel; you can set this higher than your core count because reading the duration is mainly IO-bound. The -print0 and -0 arguments of find and xargs, respectively, are used to safely handle files with spaces or other special characters. xargs executes a subshell so that a full shell pipeline runs for each file that find produces. This pipeline extracts the duration and converts it to a format that awk can parse easily: ffmpeg reads the file and prints a lot of information about it, grep extracts the duration line, cut and sed cut out the time information, and tr converts the final . to a : to make it easier to split. awk is a specialized programming language for use in shell scripts; here we use it to split the time into four variables and add them up.
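The one-liner itself is not included in this listing; a sketch that follows the steps described (the parallelism level and the exact field positions in ffmpeg's Duration line are assumptions):

find . -type f -print0 | xargs -0 -P 8 -I{} sh -c 'ffmpeg -i "{}" 2>&1 | grep "Duration" | cut -d" " -f4 | sed "s/,//" | tr "." ":"' | awk -F: '{ sec += $1*3600 + $2*60 + $3 + $4/100 } END { printf "%.1f hours\n", sec/3600 }'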

format txt as table not joining empty columns
The -n switch keeps empty columns from being merged. If your distribution does not ship a column version recent enough to support -n, you can use this alternative: perl -pe 's/(^|;);/$1 ;/g' file.csv | column -ts\; | less -S Change the delimiter to your liking.
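The primary command does not appear in this listing; going by the description, and assuming a semicolon-delimited file, it would be something along the lines of:

column -nts\; file.csv | less -S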

Show all available cows
There are lots of different cows to choose from; this script will show them all.
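The script itself is not included in this listing; a minimal sketch that does what the description says (it assumes cowsay -l lists the available cow files after a one-line header):

for cow in $(cowsay -l | tail -n +2); do cowsay -f "$cow" "$cow"; done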

Add directory to $PATH if it's not already there
Sometimes in a script you want to make sure that a directory is in the path, and add it in if it's not already there. In this example, $dir contains the new directory you want to add to the path if it's not already present. There are multiple ways to do this, but this one is a nice clean shell-internal approach. I based it on http://stackoverflow.com/a/1397020. You can also do it using tr to separate the path into lines and grep -x to look for exact matches, like this: $ if ! $(echo "$PATH" | tr ":" "\n" | grep -qx "$dir") ; then PATH=$PATH:$dir ; fi which I got from http://stackoverflow.com/a/5048977. Or replace the "echo | tr" part with a shell parameter expansion, like $ if ! $(echo "${PATH//:/$'\n'}" | grep -qx "$dir") ; then PATH=$PATH:$dir ; fi which I got from http://www.commandlinefu.com/commands/view/3209/. There are also other more regex-y ways to do it, but I find the ones listed here easiest to follow. Note some of this is specific to the bash shell.
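The command itself is not reproduced above; a plausible form of the shell-internal check the description refers to, using bash pattern matching (quoting details are an assumption):

if [[ ":$PATH:" != *":$dir:"* ]]; then PATH="$PATH:$dir"; fi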

Show numerical values for each of the 256 colors in bash
Same as http://www.commandlinefu.com/commands/view/5876, but for bash. This will show a numerical value for each of the 256 colors in bash. Everything in the command is a bash builtin, so it should run on any platform where bash is installed. It prints one color per line. If someone is interested in formatting the output, I can paste the alternative.
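The command is not shown in this listing; a minimal sketch matching the description (builtins only, one color per line; the exact escape-sequence form is an assumption):

for i in {0..255}; do echo -e "\e[38;5;${i}m${i}\e[0m"; done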

Break lines after, for example 78 characters, but don't break within a word/string
By default, Linux/Unix shells are configured with a width of 80 characters. If you want to edit a phrase or string on a line longer than that, it can take a while to get to the right spot (for example, a 1000-character line where you want to edit the 98th word at characters 598-603). Wrapping at 78 characters can be worthwhile: if you forward the text via mail and it gets quoted ("> " adds two characters to the beginning of each line), it still fits in 80 characters; otherwise it grows to 82, which is lame.
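The command itself does not appear in this listing; the standard tool for this job is fold, so it was presumably something like (the file name is a placeholder):

fold -s -w 78 file.txt

Here -w 78 sets the wrap width and -s makes fold break only at spaces, so words are never split.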

get total of inodes of root partition
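The command is missing from this listing; the usual way to get inode totals for the root partition is df's inode mode, shown here as a likely candidate rather than the original:

df -i /

The Inodes column gives the total, with IUsed and IFree showing the breakdown.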


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: