commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…):
Can easily be scripted in order to show a permission "tree" from any folder. Can also be formatted with
column -t
{ pushd .> /dev/null; cd /; for d in `echo $OLDPWD | sed -e 's/\// /g'`; do cd $d; echo -n "$d "; ls -ld .; done; popd >/dev/null ; } | column -t
from http://www.commandlinefu.com/commands/view/3731/using-column-to-format-a-directory-listing
Plays with bash arrays: instead of storing the list of files in a temp file, this stores the list in RAM, retrieves the last element of the array (the last HTML file), then removes it.
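A minimal sketch of that array technique (the file names are illustrative, not from the original command):

```shell
# Store the list in a bash array in RAM instead of a temp file.
files=( index.html about.html contact.html )   # sample names
last=${files[${#files[@]}-1]}    # retrieve the last element
unset "files[${#files[@]}-1]"    # remove it from the array
echo "$last"                     # -> contact.html
echo "${files[@]}"               # -> index.html about.html
```

On bash 4.3 and later, the shorter subscript ${files[-1]} does the same job.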
-d: list directory entries instead of contents, and do not dereference symbolic links
No need for -l and the output can be sent directly into another function expecting directory names.
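For example (the paths are illustrative):

```shell
# Without -d, ls lists a directory's contents; with -d it lists
# the directory entry itself, ready to pass to another command.
ls -d /tmp     # prints "/tmp", not the files inside it
ls -d */       # names of all subdirectories of the current directory
```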
Comments can be used directly on the command line, so I can save a brief description of what the command does in my history.
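For instance, everything after '#' is ignored by the shell but lands in the history (the command and note here are illustrative; the history lookup is shown as a comment because interactive history is not available inside a script):

```shell
# The shell ignores everything after '#', so the note costs nothing:
echo "done"   # this trailing note is stored in history, not executed
# Later, search it back out:
# history | grep "trailing note"
```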
List files and pass to openssl to calculate the hash for each file.
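A sketch of that pipeline; the digest algorithm is an assumption on my part, since the original does not name one:

```shell
# Hash every regular file under the current directory.
# find hands each path to openssl; sha256 is assumed here.
find . -type f -exec openssl dgst -sha256 {} \;
```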
Works 99.9% of the time; so far it has never required a more complex expression in manual input.
Although rm is protected against it, there are many commands that would wreak havoc on entering the obvious ".*" to address "dot-files". This sweet little expression excludes the dirs "." and ".." that cause the problems.
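One such expression (an assumption on my part; the original does not spell it out) is the glob .[!.]*, which matches a leading dot followed by a non-dot character, so "." and ".." can never match:

```shell
# Matches .bashrc, .profile, etc. but never "." or "..".
# (Names beginning with two dots, like ..cache, would need ..?* as well.)
ls -d .[!.]*
```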
I find the output of ls -lR to be unsatisfying (why is the path data up there?) and find's syntax to be awkward. Running 'du -a' means you will likely have to trim off file-size data before feeding filenames to the next step in the pipe.
I have a directory containing log files. This command deletes all but the 5 latest logs. Here is how it works:
* The ls -t command lists all files with the latest ones at the top
* The awk expression selects the lines after the fifth, i.e. everything but the 5 newest files, which are then deleted
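The steps above can be sketched like this (*.log is illustrative, and the pipeline assumes plain file names without spaces or newlines):

```shell
# List newest first, let the first 5 lines through untouched,
# and hand everything after them to rm.
ls -t *.log | awk 'NR > 5' | xargs rm -f
```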
This helped me find a botnet that had made its way into my system. Of course, this is not a foolproof or guaranteed way to find all of them or even most of them. But it helped me find this one.
Also shows the time if the file dates from the current year, or the year if it was installed before the current year; it also works if /etc is a link (Mac OS).
Use manpages; they give you "ultimate commands".
"ls -SshF --color" list by filesize (biggest at the top)
"ls -SshFr --color" list by filesize in reverse order (biggest at the bottom)
Compiles *.c files with "gcc -Wall" in the current directory, using the file name without its extension as the output file name.
1. find files greater than 10 MB
2. pipe them to xargs
3. xargs passes them as arguments to ls
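The three steps in one pipeline; -print0 and xargs -0 are additions of mine that keep paths containing spaces intact:

```shell
# 1. find files larger than 10 MB
# 2. pipe the null-terminated names to xargs
# 3. xargs passes them as arguments to ls
find . -type f -size +10M -print0 | xargs -0 ls -lh
```

With GNU xargs you can also add -r so ls is skipped entirely when nothing matches.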
Use the locate command to find files on the system and verify they exist (-e), then display each one in full detail.