Commands using grep (1,935)

  • I found this command on a different site and thought you guys might enjoy it. Just change "YOURSEARCH" to whatever you want to search for, e.g. "Linux Commands".


    9
    Q="YOURSEARCH"; GOOG_URL="http://www.google.com/search?q="; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q/\ /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"
    techie · 2013-04-03 09:56:41 12
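
    A reusable wrapper is easy to sketch (the function name is hypothetical, and Google's result markup changes over time, so treat this as a sketch rather than a guaranteed scraper):

        google() { local Q="$*"; curl -A "Mozilla/4.0" -skLm 10 "http://www.google.com/search?q=${Q// /+}" | grep -oP '/url\?q=.+?&amp' | sed 's|/url?q=||; s|&amp||'; }
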
  • example:

        user@ubuntu:~/workspace/SVN/haystak-repos/trunk/internal/src$ addpi

    Now that directory is in the list of fast-access directories, and you can switch to it at any time:

        user@ubuntu:~$ pi internal`
        user@ubuntu:~/workspace/SVN/haystak-repos/trunk/internal/src$

    Note the trailing backquote (the symbol that shares its key with ~ on the keyboard): pi will switch you to that directory. To see the list of all fast-access directories, run "cat ~/.pi".


    8
    alias pi='`cat ~/.pi | grep ' ; alias addpi='echo "cd `pwd`" >> ~/.pi'
    senthil · 2009-02-05 15:46:59 30
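
    A backquote-free variant can be sketched as shell functions, assuming ~/.pi holds "cd /path" lines exactly as addpi writes them:

        pi() { eval "$(grep -m1 "$1" ~/.pi)"; }
        addpi() { echo "cd $PWD" >> ~/.pi; }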

  • 8
    svn st | grep "^\?" | awk "{print \$2}" | xargs svn add $1
    mk · 2009-02-05 17:28:53 31
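
    If any unversioned paths contain spaces, a newline-delimited sketch is safer (GNU xargs assumed for -d):

        svn st | grep '^?' | sed 's/^?[[:space:]]*//' | xargs -d '\n' svn add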

  • 8
    gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
    eanx · 2009-03-03 04:26:57 16
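
    Since zgrep reads compressed and plain files alike, the same report can be sketched more compactly:

        zgrep -h "Invalid user" /var/log/auth.log* | awk '{print $8}' | sort | uniq -c | less
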
  • The --exclude-dir option requires grep 2.5.3 or later.


    8
    grep -r --exclude-dir=.svn PATTERN PATH
    patko · 2009-03-04 23:21:50 7
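
    With bash brace expansion, several VCS directories can be excluded at once (the braces expand to one --exclude-dir per name):

        grep -r --exclude-dir={.svn,.git,.hg} PATTERN PATH
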
  • Suppose you made a backup of your hard disk with dd: dd if=/dev/sda of=/mnt/disk/backup.img. This command lets you mount a partition from inside that image, so you can access your files directly. Replace the 1 in PARTITION=1 with the number of the partition you want to mount (as listed by sfdisk -d yourfile.img).


    8
    INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1; mount "$INFILE" "$MOUNTPT" -o loop,offset=$[ `/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//' | sed 's/,.*//'` * 512 ]
    Alanceil · 2009-03-06 21:29:13 11
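
    On newer systems util-linux can read the partition table for you; a sketch assuming losetup supports --partscan (-P, util-linux 2.21 or later):

        sudo losetup -fP --show /path/to/your/backup.img    # prints e.g. /dev/loop0
        sudo mount /dev/loop0p1 /mnt/foo                    # partition 1 shows up as loop0p1
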
  • Typical results:
    the good: Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.4 with Suhosin-Patch
    the bad: Server: Microsoft-IIS/6.0
    and the ugly: Server: Apache/2.2.10 (Win32) mod_ssl/2.2.10 OpenSSL/0.9.8i PHP/5.2.6


    8
    wget -S -O/dev/null "INSERT_URL_HERE" 2>&1 | grep Server
    asmoore82 · 2009-03-09 06:54:54 7
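
    The same header can be read with curl; a sketch using a HEAD request (note that some servers answer HEAD and GET differently):

        curl -sI "INSERT_URL_HERE" | grep -i '^Server:'
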
  • Searches the /var/log/secure log file for failed and/or invalid-user login attempts.


    8
    cat /var/log/secure | grep sshd | grep Failed | sed 's/invalid//' | sed 's/user//' | awk '{print $11}' | sort | uniq -c | sort -n
    empulse · 2009-03-30 15:48:24 18
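
    Assuming the usual "Failed password for USER from IP port N ssh2" format, the sed steps can be dropped by counting fields from the end of the line; this sketch tallies source IPs:

        grep 'sshd.*Failed password' /var/log/secure | awk '{print $(NF-3)}' | sort | uniq -c | sort -n
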
  • By putting "-not \( -name .svn -prune \)" at the very front of the find command, you eliminate the .svn directories within find itself; there is no need to grep them out. You can even create an alias for this command: alias svn_find="find . -not \( -name .svn -prune \)". Now you can do things like: svn_find -mtime -3


    8
    find . -not \( -name .svn -prune \) -type f -print0 | xargs --null grep <searchTerm>
    qazwart · 2009-07-08 20:08:05 10
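
    The prune is easy to extend to several VCS directories; a sketch, null-delimited as above:

        find . \( -name .svn -o -name .git \) -prune -o -type f -print0 | xargs --null grep <searchTerm>
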
  • Connect to a machine running ssh by its MAC address, using the "arp" command to find the matching IP.


    8
    ssh root@`for ((i=100; i<=110; i++));do arp -a 192.168.1.$i; done | grep 00:35:cf:56:b2:2g | awk '{print $2}' | sed -e 's/(//' -e 's/)//'`
    gean01 · 2009-09-09 04:32:20 15
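
    On systems with iproute2, a similar lookup can be sketched with ip neigh; the neighbor table must be populated first (e.g. by a ping sweep), and the MAC below is the same placeholder used above:

        for i in $(seq 100 110); do ping -c1 -W1 192.168.1.$i >/dev/null 2>&1 & done; wait
        ssh root@"$(ip neigh | awk '/00:35:cf:56:b2:2g/ {print $1}')"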

  • 8
    curl -s checkip.dyndns.org | grep -Eo '[0-9\.]+'
    hugin · 2009-10-26 09:15:31 3
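
    A DNS-based alternative avoids scraping a web page entirely (assumes the OpenDNS resolvers are reachable):

        dig +short myip.opendns.com @resolver1.opendns.com
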
  • -R, -r, --recursive Read all files under each directory, recursively; this is equivalent to the -d recurse option.


    8
    grep -r -i "phrase" directory/
    TheFox · 2010-01-26 16:27:00 8
  • Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file. Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.


    8
    export QQ=$(mktemp -d);(cd $QQ; curl -s -O http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25];for i in $(perl -ne 'print "$1\n" if( /^(\w+\(\))/ )' *|sort -u);do grep -h -m1 -B1 $i *; done)|grep -v '^--' > clf.sh;rm -r $QQ
    bartonski · 2010-01-30 19:47:42 40
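
    The collected functions can then be loaded into the current shell:

        source clf.sh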

  • 8
    grep -hIr :name ~/.mozilla/firefox/*.default/extensions | tr '<>=' '"""' | cut -f3 -d'"' | sort -u
    whiskybar · 2010-05-13 15:59:51 5
  • This should do the same thing and is about 70 chars shorter.


    8
    aptitude remove $(dpkg -l|egrep '^ii linux-(im|he)'|awk '{print $2}'|grep -v `uname -r`)
    dbbolton · 2010-06-10 21:23:00 10
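
    On current Debian/Ubuntu systems the package manager can do this by itself; a rough equivalent (it keeps the running kernel automatically):

        sudo apt autoremove --purge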

  • 8
    ls |tee >(grep xxx |wc >xxx.count) >(grep yyy |wc >yyy.count) |grep zzz |wc >zzz.count
    nottings · 2010-10-25 17:49:14 3
  • Trac 0.12.2-stable


    8
    grep --include=*.py -lir "delete" .
    evandrix · 2011-08-17 13:18:43 7
  • Someone over at Mozilla dot Org probably said, "I know, let's create a super-duper universal replacement for browser cookies that is persistent and even more creepy, and then NOT give our browser users the tools they need to monitor, read, block or selectively remove them!" This will let you see all the DOM storage users across all your Firefox profiles. Feel free to toss a `| sort -u` on the end to remove dupes. I highly recommend treating these as "session cookies" by scripting something that deletes this sqlite database during each Firefox start-up. Note: this does nothing for so-called "flash cookies".


    8
    strings ~/.mozilla/firefox/*/webappsstore.sqlite|grep -Eo "^.+\.:" |rev
    unixmonkey365 · 2011-09-26 15:23:09 10
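
    A minimal cleanup sketch along the lines suggested above; run it before starting Firefox, which recreates the database on launch:

        rm -f ~/.mozilla/firefox/*/webappsstore.sqlite
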
  • Works in bash; $'\t' is ANSI-C quoting, a bash feature.


    8
    grep $'\t' sample.txt
    knoppix5 · 2012-02-21 10:54:56 10
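
    In shells without ANSI-C quoting, the same search can be sketched with printf:

        grep "$(printf '\t')" sample.txt
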
  • Certain Flash video players (e.g. YouTube's) write their video streams to disk in /tmp/, but the files are unlinked: the player creates each file and then immediately deletes its filename. (Unlinking a file this way makes it hard to find, and ensures its cleanup if the browser or plugin should crash.) But as long as the Flash plugin's process runs, a file descriptor remains in its /proc/ hierarchy, through which we (and the player) still have access to the file. This method worked nicely for me when I had 50 tabs open with YouTube videos and didn't want to re-download them all with some tool.


    8
    lsof -n -P|grep FlashXX|awk '{ print "/proc/" $2 "/fd/" substr($4, 1, length($4)-1) }'|while read f;do newname=$(exiftool -FileModifyDate -FileType -t -d %Y%m%d%H%M%S $f|cut -f2|tr '\n' '.'|sed 's/\.$//');echo "$f -> $newname";cp $f ~/Vids/$newname;done
    mhs · 2012-02-25 01:49:45 5
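
    To rescue a single stream by hand, the copy step looks like this (PID 1234 and fd 16 are hypothetical; read them off the lsof output above):

        cp /proc/1234/fd/16 ~/Vids/video.flv
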
  • Uses GNU Parallel.


    8
    timeDNS() { parallel -j0 --tag dig @{} "$*" ::: 208.67.222.222 208.67.220.220 198.153.192.1 198.153.194.1 156.154.70.1 156.154.71.1 8.8.8.8 8.8.4.4 | grep Query | sort -nk5; }
    unixmonkey74668 · 2015-04-26 08:22:32 31
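
    Usage example; each resolver answers the same query, and the output comes back sorted by query time:

        timeDNS example.com
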
  • bash.org is a collection of funny quotes from IRC. WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them... Thanks to Chen for the idea and initial version! This script downloads a page of random quotes, filters the HTML to keep just the one-liner quotes, and outputs the first one. Just barely under the required 255 chars :)
    Improvement: replace the head -1 at the end with: awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt which separates the quotes with a "%" and writes them to a file. Then strfile bash_quotes.txt makes the file ready for the fortune command, and fortune bash_quotes.txt gives you a random quote from the downloaded set (see the assembled pipeline below). I refresh the file periodically and call fortune from .bashrc, so I see a funny quote every time I open a terminal.


    7
    curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
    Iftah · 2009-05-07 13:13:21 17
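
    The improvement described above, assembled into one pipeline (strfile and fortune come from the fortune package):

        curl -s http://bash.org/?random1 | grep -oE "<p class=\"quote\">.*</p>.*</p>" | grep -oE "<p class=\"qt.*?</p>" | sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' | perl -ne 'use HTML::Entities;print decode_entities($_),"\n"' | awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt
        strfile bash_quotes.txt
        fortune bash_quotes.txt
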
  • Solves "tr" issues with non C-locales under BSD-like systems (like OS X)


    7
    LC_ALL=C tr -c "[:digit:]" " " < /dev/urandom | dd cbs=$COLUMNS conv=unblock | GREP_COLOR="1;32" grep --color "[^ ]"
    zzambia · 2009-07-02 07:10:33 11
  • Removes blank lines from output. One character shorter than awk /./ filename and doesn't use a superfluous cat. To be fair, though, I'm pretty sure fraktil was thinking that being able to nuke blank lines from any command is much more useful than doing it for just one file.


    7
    grep . filename
    TheMightyBuzzard · 2009-08-09 05:33:58 12
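
    An equivalent sed sketch; like grep ., it deletes only truly empty lines:

        sed '/^$/d' filename
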
  • Returns nothing if the domain exists and 'No match for domain.com' otherwise.


    7
    whois domainnametocheck.com | grep match
    Timothee · 2009-08-11 13:33:25 17
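
    To check several candidates at once (the exact "No match" wording varies by registry, so treat this as a sketch):

        for d in example.com example.net; do whois "$d" | grep -q 'No match' && echo "$d is available"; done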