
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using grep - 1,583 results
some_cronjobed_script.sh 2>&1 | tee -a output.log | grep -C 1000 ERROR
2009-03-06 17:51:13
User: DEinspanjer
Functions: grep tee
Tags: Linux
-1

The large context number (-C 1000) is a bit of a hack, but in most of my use cases, it makes sure I'll see the whole log output.
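A minimal sketch of an alternative, assuming the goal is to always append to the log but only emit output when an ERROR actually occurred (script and log names taken from the entry above):

out=$(some_cronjobed_script.sh 2>&1); echo "$out" >> output.log; echo "$out" | grep -q ERROR && echo "$out"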

grep 'HOME.*' data.txt | awk '{print $2}' | awk 'BEGIN{FS="/"}{print $NF}' OR USE ALTERNATE WAY awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'
2009-03-05 07:28:26
User: rommelsharma
Functions: awk grep
-3

grep 'HOME.*' data.txt | awk '{print $2}' | awk 'BEGIN{FS="/"}{print $NF}'

OR

awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'

In this example, we have a text file containing several entries like:

---

c1 c2 c3 c4

this is some data

HOME /dir1/dir2/.../dirN/somefile1.xml

HOME /dir1/dir2/somefile2.xml

some more data

---

For lines starting with HOME, we extract the second field, which is a file path including the file name, and from that we keep only the file name, ignoring the slash-delimited path.

The output would be:

somefile1.xml

somefile2.xml

(If you give a negative vote, please give your reasons as well and enlighten us. :-) )
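For what it's worth, a single awk invocation covers both steps under the same assumptions about the input (lines starting with HOME, path in the second field):

awk -F'/' '/^HOME/ {print $NF}' data.txt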

grep -r --exclude-dir=.svn PATTERN PATH
2009-03-04 23:21:50
User: patko
Functions: grep
Tags: svn
8

The --exclude-dir option requires grep 2.5.3 or later.
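If your grep is older than 2.5.3, a find-based workaround along these lines should behave similarly (a sketch; substitute PATTERN and PATH as in the command above):

find PATH -name .svn -prune -o -type f -print0 | xargs -0 grep PATTERN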

wget -c -v -S -T 100 --tries=0 `curl -s http://ms1.espectador.com/podcast/espectador/la_venganza_sera_terrible.xml | grep -v xml | grep link | sed 's/<[^>]*>//g'`
2009-03-04 13:12:28
User: fmdlc
Functions: grep link sed wget
-3

This downloads a complete audio podcast.
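A possible variant pulls the enclosure URLs instead of scraping the link tags, assuming the feed uses standard <enclosure url="..."> elements and GNU grep's -o is available:

wget -c -T 100 --tries=0 $(curl -s http://ms1.espectador.com/podcast/espectador/la_venganza_sera_terrible.xml | grep -o 'enclosure url="[^"]*"' | sed 's/enclosure url="//;s/"$//')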

cal | grep --before-context 6 --after-context 6 --color -e " $(date +%e)" -e "^$(date +%e)"
2009-03-04 06:46:52
User: haivu
Functions: cal grep
Tags: PIM
3

Explanation:

* The date command evaluates to today's day of the month, blank-padded on the left if it is a single digit

* The grep command searches for and highlights today's date

* The --before-context and --after-context flags display up to 6 lines before and after the line containing today's date, thus completing the calendar.

I have tested this command on Mac OS X Leopard and Xubuntu 8.10

$ grep -rl oldstring . |xargs sed -i -e 's/oldstring/newstring/'
2009-03-03 20:10:19
User: netfortius
Functions: grep sed
Tags: perl sed
25

Recursively traverses the directory structure from . down, looks for the string "oldstring" in all files, and replaces it with "newstring" wherever found.

also:

grep -rl oldstring . | xargs perl -pi~ -e 's/oldstring/newstring/'
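If the tree may contain file names with spaces, a null-separated variant is safer (assuming GNU grep, xargs and sed):

grep -rlZ oldstring . | xargs -0 sed -i 's/oldstring/newstring/g'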
gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
export IFS=$'\n';for dir in $( ls -l | grep ^d | cut -c 52-);do du -sh $dir; done
ps axww | grep SomeCommand | awk '{ print $1 }' | xargs kill
2009-02-28 17:48:51
User: philiph
Functions: awk grep ps xargs
-7

This command kills all processes with 'SomeCommand' in the process name. There are other more elegant ways to extract the process names from ps but they are hard to remember and not portable across platforms. Use this command with caution as you could accidentally kill other matching processes!

xargs is particularly handy in this case because it makes it easy to feed the process IDs to kill and it also ensures that you don't try to feed too many PIDs to kill at once and overflow the command-line buffer.

Note that if you are attempting to kill many thousands of runaway processes at once you should use 'kill -9'. Otherwise the system will try to bring each process into memory before killing it and you could run out of memory. Typically when you want to kill many processes at once it is because you are already in a low-memory situation, so if you don't 'kill -9' you will make things worse.
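Where pgrep/pkill are available (procps on Linux, most BSDs), a shorter equivalent of the pipeline above would be something like:

pkill -f SomeCommand        # -f matches against the full command line; add -9 only if a plain TERM is not enough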

svn status | grep '^\?' | sed -e 's/^\?//g' | xargs svn add
2009-02-28 03:00:28
User: dollyaswin
Functions: grep sed xargs
0

The parts of the command:

svn status | grep '^\?' => find new files or directories in the working copy

sed -e 's/^\?//g' => remove the leading '?' character from each file name

xargs svn add => add the files to the Subversion repository

You can adapt this command to other circumstances, such as reverting added files or committing files that have been modified. ^_^
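As a sketch of the "committing files that have been modified" case mentioned above (the log message is just a placeholder):

svn status | grep '^M' | sed -e 's/^M//' | xargs svn commit -m "commit modified files"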

aptitude show $PROGRAM | grep Vers
2009-02-27 23:24:37
User: aabilio
Functions: grep
-1

Output: Version 3.2-0 (for example, if you type: aptitude show bash | grep Vers)

This depends on the language of your distribution, because the word "Version" may be translated differently.
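Two possible ways around the translation issue on Debian-based systems: force the C locale for aptitude, or ask dpkg-query for the version field directly (bash used here as an example package):

LC_ALL=C aptitude show bash | grep Vers

dpkg-query -W -f='${Version}\n' bash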

grep "FOUND" /var/log/squidclamav.log | awk '{print $5"-"$2"-"$3","$4","$11}' | sed -e 's/\,http.*url=/\,/g' | sed -e 's/&/\,/g' | sed -e 's/source=//g' |sed -e 's/user=//g' | sed -e 's/virus=//g' | sed -e 's/stream\:+//g' | sed -e 's/\+FOUND//g'
2009-02-27 13:28:18
User: nablas
Functions: awk grep sed
0

This command produces a CSV list of infected files detected by ClamAV through the squidclamav redirector.

ls -l | grep ^d
2009-02-26 20:28:10
User: sysadmn
Functions: grep ls
1

Show only the subdirectories in the current directory. In the example above, /lib has 135 files and directories. With this command, the 9 dirs jump out.
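A couple of alternatives that avoid parsing ls -l output, for comparison (note that ls -d */ also matches symlinks to directories, and -maxdepth is a GNU/BSD find extension):

ls -d */

find . -maxdepth 1 -type d ! -name .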

rpm -qa | grep PACKAGENAME | xargs rpm -q --filesbypkg
2009-02-26 14:32:12
User: piscue
Functions: grep rpm xargs
1

rpm is sometimes not wildcard friendly, so this can be useful for searching the files installed by a package.

Change PACKAGENAME to the package you want to search for.
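For example, to list the files of every installed package whose name contains httpd (the package name is chosen purely as an illustration):

rpm -qa | grep -i httpd | xargs rpm -q --filesbypkg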

dmidecode | grep -i prod
2009-02-25 23:05:17
User: rockon
Functions: grep
6

This command gives the model information of a computer. It is also useful for determining whether the host is a virtual machine or an actual physical machine.
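dmidecode usually requires root; if your version supports DMI string keywords, a narrower query returns just the product name:

sudo dmidecode -s system-product-name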

IPADDR=`ifconfig eth0 | grep -i inet | awk -F: '{print $2}'| awk '{print $1}'`
2009-02-25 22:58:19
User: rockon
Functions: awk grep
0

Useful in scripts when you just need an IP address in a variable.
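On systems where the ip tool has replaced ifconfig, a rough equivalent would be (eth0 assumed, as above):

IPADDR=$(ip -4 addr show eth0 | awk '/inet / {print $2}' | cut -d/ -f1)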

svn st | grep ^? | xargs svn add 2> /dev/null
ps -ef | grep [t]clsh
on="off"; off="on"; now=$(amixer get Master | tr -d '[]' | grep "Playback.*%" |head -n1 |awk '{print $7}'); amixer sset Master ${!now}
grep --color=auto -iRnH "$search_word" $directory
2009-02-21 19:16:33
User: tobiasboon
Functions: grep
12

Greps for the search word in the given directory and below (defaults to the current directory); see the example after the flag list.

-i case insensitive

-n shows line number

-H shows file name
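For instance, with placeholder values for the two variables:

search_word="TODO"; directory=~/src; grep --color=auto -iRnH "$search_word" "$directory"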

find /path/to/files -type f -mtime +7 | grep -v \.gz | xargs gzip
N="filepath" ; P=/proc/$(lsof +L1 | grep "$N" | awk '{print $2}')/fd ; ls -l $P | sed -rn "/$N/s/.*([0-9]+) ->.*/\1/p" | xargs -I_ cat $P/_ > "$N"
2009-02-21 02:31:24
User: laburu
Functions: awk cat grep ls sed xargs
5

Note that the file at the given path will have the contents of the (still) deleted file, but it is a new file with a new inode number; in other words, this restores the data, but it does not actually "undelete" the old file.

I posted a function declaration encapsulating this functionality to http://www.reddit.com/r/programming/comments/7yx6f/how_to_undelete_any_open_deleted_file_in_linux/c07sqwe (please excuse the crap formatting).
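Broken into manual steps, the idea is roughly this (PID and FD are placeholders taken from the lsof output):

lsof +L1 | grep "filepath"          # note the PID (2nd column) and FD (4th column, without the mode letter)

ls -l /proc/PID/fd                  # confirm which descriptor still points at the deleted file

cat /proc/PID/fd/FD > "filepath"    # copy the data back out into a new file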

for i in `ps aux | grep ssh | grep -v grep | awk {'print $2'}` ; do kill $i; done
HOST=127.0.0.1;for((port=1;port<=65535;++port)); do echo -en "$port ";if echo -en "open $HOST $port\nlogout\quit" | telnet 2>/dev/null | grep 'Connected to' > /dev/null; then echo -en "\n\nport $port/tcp is open\n\n";fi;done | grep open
ps auxwww | grep outofcontrolprocess | awk '{print $9}' | xargs kill -9