All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

ls -hog --> a more compact ls -l
I often deal with long file names, and 'ls -l' leaves very little room for them. An alternative is to use the -h, -o and -g flags (or combined, -hog):
* The -h flag produces human-readable file sizes (e.g. 91K instead of 92728)
* The -o flag suppresses the group column
* The -g flag suppresses the owner column
Since I used to alias ll='ls -l', I now do alias ll='ls -hog'.
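
A minimal sketch of the result, assuming GNU coreutils ls:

    ls -hog            # long listing with human-readable sizes, no owner and no group columns
    alias ll='ls -hog' # a drop-in replacement for the usual ll alias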

Recover a deleted file
grep searches through a file and prints out all the lines that match some pattern. Here, the pattern is some string that is known to be in the deleted file. The more specific this string can be, the better. The file being searched by grep (/dev/sda1) is the partition of the hard drive the deleted file used to reside in. The '-a' flag tells grep to treat the hard drive partition, which is actually a binary file, as text. Since recovering the entire file would be nice instead of just the lines that are already known, context control is used. The flags '-B 25 -A 100' tell grep to print out 25 lines before a match and 100 lines after a match. Be conservative with estimates on these numbers to ensure the entire file is included (when in doubt, guess bigger numbers). Excess data is easy to trim out of results, but if you find yourself with a truncated or incomplete file, you need to do this all over again. Finally, the '> results.txt' instructs the computer to store the output of grep in a file called results.txt. Source: http://spin.atomicobject.com/2010/08/18/undelete?utm_source=y-combinator&utm_medium=social-media&utm_campaign=technical
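
Putting those pieces together, the command looks roughly like this (the search string and partition are placeholders for your own):

    grep -a -B 25 -A 100 'some string from the file' /dev/sda1 > results.txt

Run it as root, since reading a raw partition requires elevated privileges.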

Print IP of any interface. Useful for scripts.
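
One way to do this (a sketch rather than the featured command), assuming the iproute2 'ip' tool and an interface named eth0:

    ip -4 addr show eth0 | grep -oP '(?<=inet )[\d.]+'   # prints e.g. 192.168.1.5

The -oP flags rely on GNU grep built with PCRE support.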

An easter egg built into python to give you the Zen of Python
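
The easter egg is the built-in 'this' module; from a shell:

    python -c 'import this'

which prints Tim Peters' collection of Python aphorisms.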

Copy files for backup storage
Back up a whole directory, copying only updated files.
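
An illustrative sketch, assuming GNU cp and placeholder paths (the featured command may differ):

    cp -au ~/documents /mnt/backup/   # -a preserves attributes and recurses; -u copies only files newer than the backup copy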

Find the package that installed a command
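
The exact command depends on your package manager; two common variants, using 'lsof' as a placeholder command name (not necessarily the featured one-liner):

    dpkg -S "$(which lsof)"   # Debian/Ubuntu
    rpm -qf "$(which lsof)"   # Red Hat/Fedora/CentOS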

Quickly analyse an Apache error log
This searches the Apache error_log for each of the five most significant Apache error levels; if any are found, the date is cut from the output so the remaining messages can be sorted and the most common occurrence of each error printed.
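
A rough sketch of that approach (not necessarily the featured one-liner), assuming the classic error_log format where the timestamp is the first bracketed field:

    for level in emerg alert crit error warn; do
      echo "== $level =="
      grep -F "[$level]" error_log | cut -d']' -f2- | sort | uniq -c | sort -rn | head -1
    done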

network usage per process
Nethogs groups bandwidth by process.
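
The tool is simply nethogs, typically run as root against one or more interfaces (eth0 here is an assumption):

    sudo nethogs eth0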

Sort all running processes by their memory & CPU usage
You can also pipe it to "tail" to show the 10 processes using the most memory.
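
A sketch, assuming the procps ps found on most Linux systems:

    ps aux --sort=%mem,%cpu          # ascending, so the heaviest processes end up at the bottom
    ps aux --sort=%mem | tail -n 10  # the 10 processes using the most memory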

Generate SHA1 hash for each file in a list
All output is placed in file SHA1SUMS which you can later check with 'sha1sum --check'. Works on most Linux distros where 'sha1sum' is installed.
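
For example, assuming the file list comes from find (the featured entry may build its list differently):

    find . -type f ! -name SHA1SUMS -print0 | xargs -0 sha1sum > SHA1SUMS
    sha1sum --check SHA1SUMS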


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).
