commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
grep - Search file for character string
Search for one or more strings in one or more files. Examples:
grep that myfile.txt
Look for the string "that" in the file called "myfile.txt" and print out each line that matches.
egrep -in "this|that" *.dat
Use extended grep to search the *.dat files for "this" or "that", case-insensitively (-i), printing the line number (-n) along with the contents of each matching line.
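The two invocations above can be tried on a throwaway file (the file name and contents here are invented for illustration):

```shell
# Create a small sample file (hypothetical contents, for demonstration only).
printf 'use that tool\nThis or THAT\nneither\n' > myfile.dat

grep that myfile.dat              # prints: use that tool
egrep -in "this|that" myfile.dat  # prints: 1:use that tool
                                  #         2:This or THAT
rm myfile.dat
```

Note that the plain grep only matches the lowercase "that", while the -i variant also catches "This" and "THAT".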
If all three Session IDs (from the repeated test connections) are the same, then you've got SSL session caching running.
With a couple of little commands, you'll be able to ignore the .DS_Store files forever in your git repositories on your Mac!
The following command registers ~/.gitignore as the global excludes file in your git configuration:
git config --global core.excludesfile ~/.gitignore
Then the following adds .DS_Store to that ignore list:
echo .DS_Store >> ~/.gitignore
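To check that the rule actually took effect, something like the following can be run (a sketch; it uses a temporary HOME so your real git configuration is untouched):

```shell
# Run the two commands in an isolated HOME, then confirm the rule works.
tmp=$(mktemp -d)
HOME="$tmp"
export HOME

git config --global core.excludesfile ~/.gitignore
echo .DS_Store >> ~/.gitignore

# Scratch repository with a .DS_Store file in it.
mkdir "$tmp/repo"
cd "$tmp/repo"
git init -q .
touch .DS_Store
git check-ignore .DS_Store   # prints ".DS_Store" when the rule matches
```

If git check-ignore prints the file name, git status will no longer show .DS_Store as untracked.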
Using sed to extract lines in a text file
If you write bash scripts a lot, you are bound to run into a situation where you want to extract some lines from a file. Yesterday, I needed to extract the first line of a file, say named somefile.txt.
This specific task can be easily done with this:
head -1 somefile.txt
For a more complicated task, like extracting the second through third lines of a file, head alone is inadequate.
So, let's try extracting lines using sed: the stream editor.
My first attempt uses the p sed command (for print):
sed 1p somefile.txt
Note that it prints the whole file, with the first line printed twice. Why? The default output behavior is to print every line of the input file stream.
The explicit 1p command just tells sed to print the first line again.
To fix it, you need to suppress the default output (using -n), making explicit prints the only way to print to default output.
sed -n 1p somefile.txt
Alternatively, you can tell sed to delete all but the first line.
sed '1!d' somefile.txt
'1!d' means: if a line is not (!) the first line, delete it.
Note that the single quotes are necessary. Otherwise, the shell's history expansion turns !d into the last command you executed that starts with the letter d.
To extract a range of lines, say lines 2 to 4, you can execute either of the following:
sed -n 2,4p somefile.txt
sed '2,4!d' somefile.txt
Note that the comma specifies a range (from the line before the comma to the line after).
What if the lines you want to extract are not in sequence, say lines 1 to 2, and line 4?
sed -n -e 1,2p -e 4p somefile.txt
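All of the variants above can be exercised on a five-line sample file (the contents are invented for the demonstration):

```shell
# Build a predictable five-line input.
printf 'line1\nline2\nline3\nline4\nline5\n' > somefile.txt

sed -n 1p somefile.txt              # line1
sed '1!d' somefile.txt              # line1
sed -n 2,4p somefile.txt            # line2, line3, line4
sed -n -e 1,2p -e 4p somefile.txt   # line1, line2, line4

rm somefile.txt
```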
deletes line 3 in known_hosts text file
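The command itself is missing from this excerpt; given the sed techniques above, it was presumably something along the lines of the following (a hedged reconstruction — the in-place -i flag shown is GNU sed syntax; BSD/macOS sed wants `sed -i '' '3d' file`). Demonstrated here on a stand-in file rather than the real ~/.ssh/known_hosts:

```shell
# Stand-in for ~/.ssh/known_hosts (contents invented for the demo).
printf 'host1 ssh-rsa AAA\nhost2 ssh-rsa BBB\nstale ssh-rsa CCC\nhost4 ssh-rsa DDD\n' > known_hosts.demo

# Delete line 3 in place (GNU sed; see above for the BSD/macOS variant).
sed -i '3d' known_hosts.demo

cat known_hosts.demo   # line 3 (the "stale" entry) is gone
rm known_hosts.demo
```

This is handy when ssh complains about a changed host key and tells you which line of known_hosts to remove.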
The output will likely point to '/etc/alternatives/java'.
So find out where that symlink points by issuing ls -l like this:
ls -l /etc/alternatives/java
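Rather than chasing symlinks one ls -l at a time, the whole chain can be resolved in one step with readlink -f (GNU coreutils; shown here on a symlink chain created for the demo, since the actual java path varies by system):

```shell
# Build a demo chain: link2 -> link1 -> /bin/sh, then resolve it in one go.
ln -s /bin/sh link1
ln -s link1 link2

readlink -f link2    # prints the canonical final target of the chain

rm link1 link2
```

On a real system, `readlink -f /etc/alternatives/java` (or `readlink -f "$(which java)"`) prints the actual JVM binary directly.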
I cannot run Tomcat from Eclipse. It says that there's another process running on port 8080, but I don't know what that process is, or how to stop it from the Services manager in Windows. So here's how you can find out what that process is, and kill it:
First, I ran netstat -ano | findstr :8080 to see which PID was listening on port 8080; it reported PID 2600.
To find out what PID 2600 was (hopefully not a trojan), I typed tasklist /FI "PID eq 2600"
Then kill it:
taskkill /F /PID 2600