All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Find the package that installed a command
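
A minimal sketch of one common approach (not necessarily the exact command posted), assuming a Debian-based system (dpkg) or an RPM-based one (rpm) and a hypothetical command name "foo":

  # Debian/Ubuntu: which package owns the binary "foo"? ("foo" is a placeholder)
  dpkg -S "$(command -v foo)"
  # Fedora/RHEL equivalent
  rpm -qf "$(command -v foo)"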

Show simple disk IO table using snmp
Show a simple table with disk IO for the specified host. Useful when you monitor a LOT of different things. Mostly used for MRTG and similar tools, but this is nice for a quick look at which disk is busy. "public" is your SNMP community string; ensure that snmpd is running on the host you intend to monitor.
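
A minimal sketch (not necessarily the exact command posted), assuming the net-snmp client tools and SNMP v2c; "myhost" is a placeholder host name:

  # query the UCD disk I/O table over SNMP v2c with community "public"
  snmptable -v2c -c public myhost UCD-DISKIO-MIB::diskIOTable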

Change the window title of your xterm
If you are using a terminal emulator capable of xterm emulation, such as PuTTY or xterm on a Linux desktop, this command will replace the title of that terminal window. It is not nice to have seventeen terminals on your desktop all titled "PuTTY"; you cannot tell which one is connected to which server and doing what. Even though the string between the quotes looks like literal text, it takes a little finesse to type. Here is how it is done key by key: echo "( ctrl-v then ctrl-[ )0;Enter_Title_String_Here( ctrl-v then ctrl-g )"( enter ). ctrl-v means hold down the ctrl key and hit v at the same time, as if you were pasting in Windows; also, do not type the parentheses, i.e., ( and ).
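
A minimal sketch of the same idea using printf escape sequences instead of typing the literal control characters (ESC is \033, BEL is \007); the title string is a placeholder:

  # set the xterm/PuTTY window title to "web01 - production"
  printf '\033]0;%s\007' "web01 - production"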

"Pretty print" $PATH, separate path per line
from: http://www.unix.com/shell-programming-and-scripting/28047-split-print-path.html
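
One common way to do this (the linked thread shows several variants):

  # replace each ':' in $PATH with a newline
  echo "$PATH" | tr ':' '\n'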

Create incremental backups of individual folders using find and tar-gzip
Problem: I wanted to back up user data individually, using an incremental method. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user ( /mnt/storage/profiles/mike; /mnt/storage/profiles/lucy ... ). I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find with a limited directory depth and two variables (tarfile=username & desdir=destination), tar creates a .tgz file for each folder, resulting in "mike_2013-12-05.tgz" and "lucy_2013-12-05.tgz".
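
A sketch under the assumptions in the description (the posted command may differ, e.g. by using tar's incremental options); "desdir" is a hypothetical destination directory:

  desdir=/mnt/backups
  # one dated .tgz per top-level user folder under /mnt/storage/profiles
  find /mnt/storage/profiles -mindepth 1 -maxdepth 1 -type d | while read -r dir; do
      tarfile="$(basename "$dir")_$(date +%Y-%m-%d).tgz"
      tar -czf "$desdir/$tarfile" -C /mnt/storage/profiles "$(basename "$dir")"
  done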

Create a continuous digital clock in Linux terminal
Same effect, using only shell commands.
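
A minimal sketch (the posted command may position the clock differently, e.g. with tput):

  # clear the screen and print the time once a second
  while sleep 1; do clear; date +%T; done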

Copy structure
Clone directory structure without the files
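
A sketch of one common way to do this, assuming rsync is available; "src/" and "dst/" are placeholder paths:

  # copy only the directory tree, excluding every regular file
  rsync -a --include='*/' --exclude='*' src/ dst/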

subtraction between lines
It always seems strange to me to see sed and awk in the same command line when you can avoid it.
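
A sketch in awk alone (not necessarily the command posted), assuming a single numeric column in a hypothetical file "data.txt":

  # print the difference between each line and the previous one
  awk 'NR > 1 { print $1 - prev } { prev = $1 }' data.txt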

Press Any Key to Continue
Halt script progress until a key has been pressed. Source: http://bash-hackers.org/wiki/doku.php/mirroring/bashfaq/065
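
A common bash idiom for this, along the lines of the linked BashFAQ entry:

  # wait silently for a single keypress, then move to a fresh line
  read -n 1 -s -r -p "Press any key to continue..."
  echo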

Using bash inline
There are two ways to use "here documents" with bash to fill stdin. The following examples show their use with the "bc" command.

a) Using a delimiter at the end of the data:

$ <<eeooff bc
> k=1024
> m=k*k
> g=k*m
> g
> eeooff
1073741824

b) Using the "inline" version with three less-than symbols (a here string):

$ <<< "k=1024; m=k*k; g=k*m; g" bc
1073741824

One nice advantage of using the triple less-than version is that the command can easily be recalled from command-line history and re-executed.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).
