All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Setting global redirection of STDERR to STDOUT in a script
You have a script where ALL STDERR should be redirected to STDOUT and you don't want to add "2>&1" at the end of each command, e.g.: $ ls -al /foo/bar 2>&1 Then just add this piece of code at the beginning of your script! I hope this can help someone. :)
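A minimal sketch of one way to do it, assuming a Bash script (the exact snippet from the original entry isn't reproduced here):

#!/bin/bash
exec 2>&1           # from this point on, everything the script writes to STDERR goes to STDOUT
ls -al /foo/bar     # no per-command "2>&1" needed any more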

Find number of computers in domain, OU, etc.

Save a copy of all debian packages in the form in which they are installed and configured on your system
A copy of every Debian package installed on your system will be rebuilt, including any changes you made to their configuration files, and placed in the current directory. Make sure you have enough disk space (say 2-3 GB). Abort at any time with Ctrl+C.
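The command itself isn't reproduced above; a plausible sketch, assuming the dpkg-repack tool (Debian package "dpkg-repack") is what does the rebuilding:

$ dpkg -l | awk '/^ii/ {print $2}' | xargs dpkg-repack

Running it as root (or under fakeroot) preserves the original file ownership inside the rebuilt .deb files.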

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
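The command itself isn't shown above; one hedged possibility, using Python's standard csv and json modules:

$ python3 -c "import csv, json; print(json.dumps(list(csv.reader(open('csv_file.csv')))))"

This emits a JSON array of rows (arrays of strings); swap csv.reader for csv.DictReader to get objects keyed by the header row.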

use the real 'rm', distribution brain-damage notwithstanding
The backslash avoids any 'rm' alias that might be present and runs the 'rm' command in $PATH instead. In a misguided attempt to be more "friendly", some Linux distributions (or sites, etc.) alias 'rm' to 'rm -i'. Unfortunately, this trains users to expect that files won't actually be deleted until they okay it. This expectation will fail with catastrophic results when they use other distributions, move to other sites, etc., and the protection isn't even fully reliable with the alias in place. It's too late to fix 'rm', but '\rm' should work everywhere (under bash).
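For illustration (the file name is just a placeholder):

$ \rm unwanted-file    # calls the real rm from $PATH, bypassing any alias such as 'rm -i'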

Create tar over SSH
Really useful when you're out of space on your current machine. You can also run this with cat, for example: $ tar zcvf - /folder/ | ssh root@192.168.0.1 "cat > /dest/folder/file.tar.gz" Or even pipe other commands: $ tcpdump | ssh root@10.0.0.1 "cat > /tmp/tcpdump.log"
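A related variation, unpacking straight into a directory on the remote host instead of saving an archive there (host and paths are illustrative):

$ tar cf - -C /folder . | ssh root@192.168.0.1 "tar xf - -C /dest/folder"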

Remove all HTML tags from a file
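No command is listed for this entry; a common sed approach (not necessarily the one used here):

$ sed -e 's/<[^>]*>//g' file.html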

Mirror a directory structure from websites with Apache-generated file indexes
wget, curl and friends are not good at mirroring files from websites, especially those with Apache-generated directory listings: they endlessly waste time downloading useless index HTML pages. lftp's mirror command does a better job without the mess.
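A hedged sketch of lftp's mirror command against such an index page (the URL and options are illustrative):

$ lftp -e 'mirror --continue --verbose; quit' http://example.com/files/

lftp parses the Apache index pages itself, so only the actual files end up on disk.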

Kill all Zombie processes if they accept it!
Tested on FreeBSD 8.1 with csh. The script works correctly, but the zombies do not die! I hope it runs and works as expected on Linux and other systems.
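The command isn't reproduced above; since a zombie only disappears when its parent reaps it (or the parent exits and init adopts it), the usual trick is to signal the parents — a sketch in bash syntax for Linux procps ps:

$ kill -HUP $(ps -A -o stat,ppid | awk '/^[Zz]/ {print $2}')

Whether this helps depends entirely on how each parent handles SIGHUP; it may also terminate the parent process itself.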

computes the most frequently used words of a text file
using $ cat WAR_AND_PEACE_By_LeoTolstoi.txt | tr -cs "[:alnum:]" "\n" | tr "[:lower:]" "[:upper:]" | sort -S16M | uniq -c | sort -nr | cat -n | head -n 30 (the -S buffer-size option, e.g. "sort -S1G", is Linux/GNU sort only) will also do the job, but has some drawbacks (caused by the space/time complexity of sorting) for bigger files...


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
