Commands using grep (1,935)

  • I found this command on a different site and thought you guys might enjoy it. Just change "YOURSEARCH" to whatever you want to search for, e.g. "Linux Commands".


    9
    Q="YOURSEARCH"; GOOG_URL="http://www.google.com/search?q="; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q/\ /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"
    techie · 2013-04-03 09:56:41 12
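    A sketch of the same one-liner wrapped in a reusable function ("goog" is just an illustrative name, and Google changes its result markup often, so the grep/sed scraping may need adjusting):

        goog() {
            local q="$*" url="http://www.google.com/search?q=" agent="Mozilla/4.0"
            local stream
            stream=$(curl -A "$agent" -skLm 10 "${url}\"${q// /+}\"" |
                grep -oP '/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//')
            # crude percent-decoding: turn %XX into \xXX escapes for echo -e
            echo -e "${stream//\%/\\x}"
        }
        goog Linux Commands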
  • example:

        user@ubuntu:~/workspace/SVN/haystak-repos/trunk/internal/src$ addpi

    Now that directory is in the list of fast access directories. You can switch to it anytime like this:

        user@ubuntu:~$ pi internal`
        user@ubuntu:~/workspace/SVN/haystak-repos/trunk/internal/src$

    Please note the backquote (the symbol that shares its key with ~ on the keyboard): the alias opens a backquote, so typing one after "pi internal" closes it, and pi will switch you to that directory. To see the list of all fast access directories, run "cat ~/.pi".


    8
    alias pi='`cat ~/.pi | grep ' ; alias addpi='echo "cd `pwd`" >> ~/.pi'
    senthil · 2009-02-05 15:46:59 30
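    If the backquote dance feels fragile, the same idea works as plain shell functions with no command substitution required at the prompt (a sketch; unlike the alias version, this stores bare paths in ~/.pi instead of "cd ..." lines):

        # Save the current directory as a fast-access entry.
        addpi() { pwd >> ~/.pi; }
        # Jump to the first saved directory matching the argument.
        pi() { cd "$(grep -m1 -- "$1" ~/.pi)"; }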

  • 8
    svn st | grep "^\?" | awk "{print \$2}" | xargs svn add $1
    mk · 2009-02-05 17:28:53 31

  • 8
    gunzip -c /var/log/auth.log.*.gz | cat - /var/log/auth.log /var/log/auth.log.0 | grep "Invalid user" | awk '{print $8;}' | sort | uniq -c | less
    eanx · 2009-03-03 04:26:57 16
  • The --exclude-dir option requires grep 2.5.3 or later.


    8
    grep -r --exclude-dir=.svn PATTERN PATH
    patko · 2009-03-04 23:21:50 7
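    If you are stuck on an older grep, a rough equivalent is to filter the matches afterwards (this still wastes time descending into .svn, it just hides those hits):

        grep -r PATTERN PATH | grep -v '/\.svn/'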
  • Suppose you made a backup of your hard disk with dd:

        dd if=/dev/sda of=/mnt/disk/backup.img

    This command enables you to mount a partition from inside this image, so you can access your files directly. Substitute PARTITION=1 with the number of the partition you want to mount (as listed by sfdisk -d yourfile.img).


    8
    INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1; mount "$INFILE" "$MOUNTPT" -o loop,offset=$[ `/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//' | sed 's/,.*//'` * 512 ]
    Alanceil · 2009-03-06 21:29:13 11
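    The same computation in readable steps; 512-byte sectors are assumed, which is standard for MBR-style images:

        INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1
        # start sector of the chosen partition, as reported by sfdisk -d
        START=$(/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//;s/,.*//')
        mount "$INFILE" "$MOUNTPT" -o loop,offset=$(( START * 512 ))
        # newer util-linux alternative: losetup -Pf "$INFILE" exposes each
        # partition as its own /dev/loopXpN device, no arithmetic needed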
  • the good:

        Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.4 with Suhosin-Patch

    the bad:

        Server: Microsoft-IIS/6.0

    and the ugly:

        Server: Apache/2.2.10 (Win32) mod_ssl/2.2.10 OpenSSL/0.9.8i PHP/5.2.6


    8
    wget -S -O/dev/null "INSERT_URL_HERE" 2>&1 | grep Server
    asmoore82 · 2009-03-09 06:54:54 7
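    A rough curl equivalent, if you prefer it over wget; -I sends a HEAD request, so nothing is actually downloaded:

        curl -sI "INSERT_URL_HERE" | grep -i '^Server'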
  • Searches the /var/log/secure log file for failed and/or invalid-user login attempts.


    8
    cat /var/log/secure | grep sshd | grep Failed | sed 's/invalid//' | sed 's/user//' | awk '{print $11}' | sort | uniq -c | sort -n
    empulse · 2009-03-30 15:48:24 18
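    Note that once sed strips "invalid" and "user", the $11 that gets printed is the attacking IP address, not the username. A sketch that goes after the IP directly and skips the field arithmetic (IPv4 only):

        grep 'sshd.*Failed' /var/log/secure | grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}' | sort | uniq -c | sort -n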
  • By putting the "-not \( -name .svn -prune \)" at the very front of the "find" command, you eliminate the .svn directories in the find command itself. No need to grep them out. You can even create an alias for this command:

        alias svn_find="find . -not \( -name .svn -prune \)"

    Now you can do things like:

        svn_find -mtime -3


    8
    find . -not \( -name .svn -prune \) -type f -print0 | xargs --null grep <searchTerm>
    qazwart · 2009-07-08 20:08:05 10
  • Connect to a machine running ssh by its MAC address, using the "arp" command to look up the matching IP.


    8
    ssh root@`for ((i=100; i<=110; i++));do arp -a 192.168.1.$i; done | grep 00:35:cf:56:b2:2g | awk '{print $2}' | sed -e 's/(//' -e 's/)//'`
    gean01 · 2009-09-09 04:32:20 15
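    On a modern Linux box the arp loop can be replaced by a quick ping sweep plus "ip neigh", which is easier to parse (a sketch; aa:bb:cc:dd:ee:ff stands in for the real MAC, and the sweep just populates the neighbour cache first):

        # populate the neighbour cache for the range we care about
        for i in {100..110}; do ping -c1 -W1 192.168.1.$i >/dev/null 2>&1 & done; wait
        # ip neigh prints "IP dev IFACE lladdr MAC STATE"; grab the IP
        ssh root@$(ip neigh | awk '/aa:bb:cc:dd:ee:ff/ {print $1}')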

  • 8
    curl -s checkip.dyndns.org | grep -Eo '[0-9\.]+'
    hugin · 2009-10-26 09:15:31 3
  • -R, -r, --recursive Read all files under each directory, recursively; this is equivalent to the -d recurse option.


    8
    grep -r -i "phrase" directory/
    TheFox · 2010-01-26 16:27:00 8
  • Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file. Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.


    8
    export QQ=$(mktemp -d);(cd $QQ; curl -s -O http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25];for i in $(perl -ne 'print "$1\n" if( /^(\w+\(\))/ )' *|sort -u);do grep -h -m1 -B1 $i *; done)|grep -v '^--' > clf.sh;rm -r $QQ
    bartonski · 2010-01-30 19:47:42 40

  • 8
    grep -hIr :name ~/.mozilla/firefox/*.default/extensions | tr '<>=' '"""' | cut -f3 -d'"' | sort -u
    whiskybar · 2010-05-13 15:59:51 5
  • This should do the same thing and is about 70 chars shorter.


    8
    aptitude remove $(dpkg -l|egrep '^ii linux-(im|he)'|awk '{print $2}'|grep -v `uname -r`)
    dbbolton · 2010-06-10 21:23:00 10
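    Since this pipes package names straight into aptitude remove, it is worth running the selection on its own first as a dry run (note that dpkg -l pads its columns, so two spaces after "ii" are matched here):

        dpkg -l | grep -E '^ii  linux-(im|he)' | awk '{print $2}' | grep -v "$(uname -r)"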

  • 8
    ls |tee >(grep xxx |wc >xxx.count) >(grep yyy |wc >yyy.count) |grep zzz |wc >zzz.count
    nottings · 2010-10-25 17:49:14 3
  • Trac 0.12.2-stable


    8
    grep --include=*.py -lir "delete" .
    evandrix · 2011-08-17 13:18:43 7
  • Someone over at Mozilla dot Org probably said, "I know, let's create a super-duper universal replacement for browser cookies that are persistent and even more creepy and then NOT give our browser users the tools they need to monitor, read, block or selectively remove them!" This will let you see all the DOM storage entries in all your firefox profiles. Feel free to toss a `| sort -u` on the end to remove dupes. I highly recommend you treat these as "session cookies" by scripting something that deletes this sqlite database during each firefox start-up (see the sketch below). Note: does not do anything for so-called "flash cookies".


    8
    strings ~/.mozilla/firefox/*/webappsstore.sqlite|grep -Eo "^.+\.:" |rev
    unixmonkey365 · 2011-09-26 15:23:09 10
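    A sketch of the start-up cleanup suggested above, as a tiny launcher script (purely illustrative; Firefox must not be running when the file is removed, or it may simply be recreated from memory):

        #!/bin/sh
        # treat DOM storage like session cookies: wipe it before each run
        rm -f ~/.mozilla/firefox/*/webappsstore.sqlite
        exec firefox "$@"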
  • works in bash


    8
    grep $'\t' sample.txt
    knoppix5 · 2012-02-21 10:54:56 10
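    $'\t' is bash's ANSI-C quoting, expanded to a literal tab character before grep ever runs. With GNU grep you can get the same effect in any shell via PCRE (the -P flag is GNU-specific, though):

        grep -P '\t' sample.txt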
  • Certain Flash video players (e.g. Youtube) write their video streams to disk in /tmp/ , but the files are unlinked. i.e. the player creates the file and then immediately deletes the filename (unlinking files in this way makes it hard to find them, and/or ensures their cleanup if the browser or plugin should crash etc.) But as long as the flash plugin's process runs, a file descriptor remains in its /proc/ hierarchy, from which we (and the player) still have access to the file. The method above worked nicely for me when I had 50 tabs open with Youtube videos and didn't want to have to re-download them all with some tool.


    8
    lsof -n -P|grep FlashXX|awk '{ print "/proc/" $2 "/fd/" substr($4, 1, length($4)-1) }'|while read f;do newname=$(exiftool -FileModifyDate -FileType -t -d %Y%m%d%H%M%S $f|cut -f2|tr '\n' '.'|sed 's/\.$//');echo "$f -> $newname";cp $f ~/Vids/$newname;done
    mhs · 2012-02-25 01:49:45 5
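    The trick generalises to any deleted-but-still-open file; a minimal sketch (the PID 1234 and fd 5 are placeholders you would read off the lsof output):

        # deleted-but-open files show up marked "(deleted)" in lsof
        lsof -n | grep '(deleted)'
        # /proc/<PID>/fd/<FD> still reaches the data; copy it back out
        cp /proc/1234/fd/5 /tmp/recovered-file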
  • Uses GNU Parallel.


    8
    timeDNS() { parallel -j0 --tag dig @{} "$*" ::: 208.67.222.222 208.67.220.220 198.153.192.1 198.153.194.1 156.154.70.1 156.154.71.1 8.8.8.8 8.8.4.4 | grep Query | sort -nk5; }
    unixmonkey74668 · 2015-04-26 08:22:32 31
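    Usage is just the function name plus a hostname; the output is one dig "Query time" line per resolver, sorted fastest first:

        timeDNS example.com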
  • bash.org is a collection of funny quotes from IRC. WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them... Thanks to Chen for the idea and initial version! This script downloads a page with random quotes, filters the html to retrieve just the one-liner quotes and outputs the first one. Just barely under the required 255 chars :)

    Improvement: you can replace the head -1 at the end with:

        awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt

    which separates the quotes with a "%" and writes them to the file. Then:

        strfile bash_quotes.txt

    makes the file ready for the fortune command, so that:

        fortune bash_quotes.txt

    gives you a random quote from those in the downloaded file. I download a file periodically and then use fortune in .bashrc, so I see a funny quote every time I open a terminal.


    7
    curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
    Iftah · 2009-05-07 13:13:21 17
  • Solves "tr" issues with non-C locales under BSD-like systems (like OS X).


    7
    LC_ALL=C tr -c "[:digit:]" " " < /dev/urandom | dd cbs=$COLUMNS conv=unblock | GREP_COLOR="1;32" grep --color "[^ ]"
    zzambia · 2009-07-02 07:10:33 11
  • Remove blank lines from output. One character shorter than awk /./ filename and doesn't use a superfluous cat. To be fair though, I'm pretty sure fraktil was thinking that being able to nuke blank lines from any command's output is much more useful than just from one file.


    7
    grep . filename
    TheMightyBuzzard · 2009-08-09 05:33:58 12
  • Returns nothing if the domain exists and 'No match for domain.com' otherwise.


    7
    whois domainnametocheck.com | grep match
    Timothee · 2009-08-11 13:33:25 17
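    That makes a scripted availability check easy; a sketch (the exact "No match" wording differs between whois servers, so adjust the pattern per TLD):

        domain=domainnametocheck.com
        if whois "$domain" | grep -qi 'no match'; then echo "$domain appears to be unregistered"; fi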