commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
I had some trouble removing empty lines from a file (perhaps due to UTF-8, as it's the source of all evil); \W did the trick eventually.
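A minimal sketch with GNU grep; the file names are placeholders, and note that ^\W*$ also drops lines containing only whitespace or punctuation, not just truly empty ones:
grep -v '^\W*$' input.txt > cleaned.txt   # -v keeps every line that contains at least one word character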
Using the grep command, retrieve all lines from any log file in /var/log/ that contain one of the problem states.
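A minimal sketch, assuming the "problem states" are strings such as error, failed or critical and that the logs of interest are plain-text *.log files your user can read:
grep -iE 'error|fail(ed|ure)?|critical' /var/log/*.log   # -i ignores case; adjust the alternatives to your own problem states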
This set of commands was very convenient when I was preparing some XML files for typesetting a book. I wanted to check which styles I had to prepare but couldn't remember all the tags I had used. This one saved me from error-prone browsing of all my files. It should also be useful if you process XML files with XSL, or when using your own XML application.
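A hedged sketch of listing every tag used across a set of XML files, assuming GNU grep and simple, unescaped markup:
grep -hoE '</?[A-Za-z][A-Za-z0-9:_-]*' *.xml | tr -d '</' | sort -u   # print each opening/closing tag, strip the brackets, keep unique names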
Find all email addresses in a file, printing each match. The addresses do not have to be alone on a line; for example, you can grab them from HTML-formatted emails or CSV files. Use a combination of
... | sort | uniq
to filter them.
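A hedged sketch of the matching part, with a deliberately loose pattern and a placeholder file name (contacts.csv):
grep -oE '[[:alnum:]._%+-]+@[[:alnum:].-]+\.[[:alpha:]]{2,}' contacts.csv | sort | uniq   # -o prints each address on its own line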
Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food where you want the total calories. Use this to monitor and keep a running total of anything that outputs numbers.
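A sketch of one way to total such numbers, assuming whole numbers and a hypothetical journal.txt:
grep -oE '[0-9]+' journal.txt | paste -sd+ - | bc   # join every number with + and let bc compute the sum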
This will extract all of the URLs from a Firefox session (including URLs in a tab's history). The sessionstore.js file is in ~/.mozilla/firefox/{firefox profile}.
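A sketch of the extraction, assuming the older JSON-based sessionstore.js format; the wildcard stands in for the profile directory name:
grep -oE '"url":"[^"]+"' ~/.mozilla/firefox/*/sessionstore.js | cut -d'"' -f4 | sort -u   # field 4 of each "url":"..." match is the address itself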
This expression looks for groups inside a groupOfNames class element that is itself inside one (or many) organizational unit (ou) nodes in the LDAP tree. It gives you a quick dump of all the groups the user belongs to. Handy for displaying on a web page.
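A hedged sketch with OpenLDAP's ldapsearch; the base DN, the user's DN and the attribute shown are placeholders, not the author's actual values:
ldapsearch -x -b "dc=example,dc=com" "(&(objectClass=groupOfNames)(member=uid=jdoe,ou=People,dc=example,dc=com))" cn   # list the cn of every group that has this member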
Finds all corrupted JPEG files in the current directory and its subdirectories, and displays the error or warning found.
The jpeginfo tool is part of the jpeginfo package in Debian.
Should you wish to get only the corrupted filenames, use cut to extract them:
find ./ -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR" | cut -d " " -f 1
Lists revisions in a Subversion repository with a timestamp that doesn't follow the revision numbering order. If everything is OK, nothing is displayed.
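One way to sketch such a check, assuming an ascending svn log -q listing whose date field sorts lexically (timezone changes between commits can still cause false positives):
svn log -q -r 1:HEAD | awk -F'|' '/^r[0-9]+ / { if (prev != "" && $3 < prev) print $1; prev = $3 }'   # print any revision dated earlier than its predecessor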
This works in Ubuntu; I hope it works on all Linux machines. On other Unixes, tail should also be capable of handling more than one file with the '-f' option.
This command line simply takes log files that are text files and do not end with a number, and continuously monitors them.
Putting an alias for it in .profile makes it even more convenient.
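A sketch of that idea, assuming GNU find and file and read access to /var/log (otherwise prefix the whole thing with sudo):
find /var/log -type f -exec file {} + | grep text | cut -d: -f1 | grep -v '[0-9]$' | xargs tail -f   # keep only text files whose names don't end in a digit, then follow them all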
The variable WIRELESSINTERFACE indicates your wireless interface.
Note that this assumes the application is an SVN checkout and so we have to throw away all the .svn files before making the substitution.
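A hedged sketch of that kind of tree-wide substitution with the .svn directories pruned away; the search and replacement strings are placeholders and GNU find/sed are assumed:
find . -path '*/.svn' -prune -o -type f -print0 | xargs -0 sed -i 's/oldstring/newstring/g'   # -prune keeps sed away from the Subversion metadata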
Exported files will get a .r23 extension (where 23 is the revision number).
Just find the daemon with $ netstat -atulpe, then type in its name and it gets the SIGTERM.
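A sketch with placeholder port and daemon names (3306 and mysqld here stand in for whatever you find):
netstat -atulpe | grep 3306   # spot the daemon listening on the suspect port
pkill mysqld                  # pkill sends SIGTERM by default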
This will display the typedefs, structs, unions and functions declared in 'stdio.h' (check out the _IO_FILE structure). It is helpful when you want to know what a particular header file offers. The 'cpp' command is GNU's C preprocessor.
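A sketch of the idea, assuming a glibc system where the headers live under /usr/include:
cpp /usr/include/stdio.h | grep -v '^#' | grep -v '^$' | less   # drop the linemarkers and blank lines the preprocessor emits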
Tuned for a short command line - you can set the path to sessionstore.js more reliably instead of using asterisks etc.
Useful when you are not at home and really need to get at the tabs currently open on your home computer (via SSH). I use it from work if I forgot to bookmark some interesting new web page that I visited at home. It is also another way to list tabs when Firefox has crashed (restoring tabs doesn't always work).
This script also includes tabs that were closed a short time before.
The curl command retrieves the HTML text containing the IP address. The grep command picks the IP address out of that HTML text.
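A sketch using one of the many "what is my IP" pages; the exact service is an assumption, and any page that echoes your address back will do:
curl -s http://checkip.dyndns.org/ | grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+'   # -s hides the progress bar, -o prints just the address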
Get your colorized grep output in less(1). This involves two things: forcing grep to output colors even though it's not going to a terminal and telling less to handle those properly.
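For example, with a placeholder pattern and file name:
grep --color=always 'pattern' logfile | less -R   # --color=always forces the colour codes; less -R renders them instead of printing escape sequences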