Commands using grep (1,935)


  • 18
    mount -t ntfs-3g -o ro,loop,uid=user,gid=group,umask=0007,fmask=0117,offset=0x$(hd -n 1000000 image.vdi | grep "eb 52 90 4e 54 46 53" | cut -c 1-8) image.vdi /mnt/vdi-ntfs
    Cowboy · 2009-08-23 17:25:07 12
  • This function takes a word or a phrase as arguments and fetches definitions using Google's "define" syntax. The "nl" and perl portions aren't strictly necessary; they just make the output a bit more readable, and this also works:
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}
    If your version of grep doesn't have Perl-compatible regex support, you can use this version instead:
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}


    18
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
    eightmillion · 2010-01-29 05:01:11 18
  • Particularly useful on OS X, where netstat doesn't have a -p option. A grep-free variant is sketched below.


    18
    lsof -i -P | grep -i "listen"
    patko · 2010-10-14 09:37:51 10
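
    A sketch of the grep-free variant, assuming a reasonably recent lsof: -iTCP restricts the listing to TCP sockets, -sTCP:LISTEN keeps only listening ones, and -nP skips host and port name lookups.
    lsof -nP -iTCP -sTCP:LISTEN
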
  • If you have a bunch of small files that you want to read, you can cat each one alone (boring), do a cat * and lose track of which line belongs to which file, or do a grep . *. The pattern "." matches any character, so every non-empty line matches, and grep in multi-file mode places a "filename:" before each matched line. Add -r and it works recursively too (see the variant below).


    18
    grep . *
    theist · 2011-09-01 09:16:04 10
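
    A sketch of the recursive form, assuming GNU grep: -r descends into subdirectories and -n adds line numbers next to the file names.
    grep -rn . .
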

  • 17
    curl -s http://checkip.dyndns.org/ | grep -o "[[:digit:].]\+"
    lv4tech · 2009-05-14 09:43:31 13
  • Use multiple patterns with grep -v, so you can print all lines in a file except those containing any of the patterns you specify. (An equivalent using repeated -e options is sketched below.)


    16
    grep 'test' somefile | grep -vE '(error|critical|warning)'
    zlemini · 2009-10-23 23:21:36 12
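
    A hedged equivalent that avoids extended-regex alternation by passing one -e pattern per term to the excluding grep; for plain fixed strings the two forms should behave the same.
    grep 'test' somefile | grep -v -e error -e critical -e warning
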
  • Find all computers connected to the host through a TCP connection. (A variant using ss is sketched below.)


    16
    netstat -lantp | grep ESTABLISHED |awk '{print $5}' | awk -F: '{print $1}' | sort -u
    bitbasher · 2011-07-21 21:23:10 16
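
    A rough equivalent for systems where netstat has been replaced by ss; like the original it assumes IPv4-style peer addresses, and the ss column layout can vary slightly between versions.
    ss -tn | awk '$1=="ESTAB" {split($5,a,":"); print a[1]}' | sort -u
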
  • This will give you the Dell Service Tag number associated with your machine. Incredibly useful when you need that number for tech support or downloads. (dmidecode can also query the field directly; see below.)


    15
    sudo dmidecode | grep Serial\ Number | head -n1
    nlinux · 2009-02-18 14:54:28 3187
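
    A shortcut worth trying, assuming a dmidecode new enough to support keyword queries; it prints just the serial/service tag with no grep needed.
    sudo dmidecode -s system-serial-number
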
  • This command takes the output of the 'last' command, removes empty lines, gets just the first field ($USERNAME), sorts the $USERNAMES in reverse order and then gives a summary count of unique matches. (A variant that ranks users by login count is sketched below.)


    15
    last | grep -v "^$" | awk '{ print $1 }' | sort -nr | uniq -c
    hkyeakley · 2009-02-18 16:38:59 13
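
    A variation on the same idea, filtering blank lines in awk itself and sorting by login count rather than by name; like the original it will also count the "reboot" pseudo-user and the trailing "wtmp begins" line.
    last | awk 'NF {print $1}' | sort | uniq -c | sort -rn
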
  • Several times I find myself hitting my up arrow and changing the search term. Unfortunately, I waste too much time typing:
    grep kernel /var/log/messages
    Redirecting STDIN allows me to put the search term at the end, so there's less cursor movement when changing what I'm searching for:
    < /var/log/messages grep kernel
    If you're using the emacs keyboard bindings, then after you press your up arrow, press CTRL+w to erase the word. If this has already been submitted, I couldn't find it with the search utility.


    15
    < /path/to/file.txt grep foo
    atoponce · 2009-03-29 02:43:40 16
  • Just a simple way that needs no additional tools. Of course, replace eth0 with your interface.


    15
    while [ /bin/true ]; do OLD=$NEW; NEW=`cat /proc/net/dev | grep eth0 | tr -s ' ' | cut -d' ' -f "3 11"`; echo $NEW $OLD | awk '{printf("\rin: % 9.2g\t\tout: % 9.2g", ($1-$3)/1024, ($2-$4)/1024)}'; sleep 1; done
    hons · 2011-03-22 10:02:23 8
  • Useful for when you don't have nmap and need to find a missing host. Pings all addresses from 10.1.1.1 to 10.1.1.254; modify for your subnet. The timeout is set to 1 second for speed; if running over a slow connection you should raise that to avoid missing replies. This will clean up the junk, leaving just the IP addresses (a parallel variant is sketched below):
    for i in {1..254}; do ping -c 1 -W 1 10.1.1.$i | grep 'from' | cut -d' ' -f 4 | tr -d ':'; done


    14
    for i in {1..254}; do ping -c 1 -W 1 10.1.1.$i | grep 'from'; done
    SuperJediWombat · 2010-04-07 16:57:53 7
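
    A possible speed-up, assuming bash: because each ping waits up to a second, the sequential loop can take minutes, so backgrounding the pings and waiting for them all sweeps the subnet in a few seconds (the replies arrive unordered).
    for i in {1..254}; do ping -c 1 -W 1 10.1.1.$i | grep 'from' & done; wait
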
  • For automated unit tests I wanted my program to run normally, but if it crashed, to add a stack trace to the output log. I came up with this command so I wouldn't have to mess around with core files. The one downside is that it does smoosh your program's stderr and stdout together. (Passing arguments to the program via gdb's --args is sketched below.)


    14
    gdb -batch -ex "run" -ex "bt" ${my_program} 2>&1 | grep -v ^"No stack."$
    kurt · 2010-12-29 17:46:31 28
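
    A sketch of the same idea when ${my_program} needs its own command-line arguments (arg1 and arg2 below are placeholders): gdb's --args option treats everything after it as the program and its argument list.
    gdb -batch -ex "run" -ex "bt" --args ${my_program} arg1 arg2 2>&1 | grep -v ^"No stack."$
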
  • Grep will read the contents of each file under the current directory and use the REs $1 $2 ... $n to match the contents. In case of a match, grep will print the appropriate file, line number and the matching line. It's just easier to write
    ff word1 word2 word3
    instead of
    grep -rinE 'word1|word2|word3' .


    14
    ff() { local IFS='|'; grep -rinE "$*" . ; }
    RanyAlbeg · 2011-06-10 10:25:10 137

  • 14
    curl -s https://api.github.com/users/<username>/repos?per_page=1000 |grep git_url |awk '{print $2}'| sed 's/"\(.*\)",/\1/'
    wuseman1 · 2019-11-19 20:31:19 262
  • Purge all configuration files of removed packages. (A dpkg/apt-get-only variant is sketched below.)


    13
    sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`
    kelevra · 2009-04-28 11:44:04 15
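
    A rough equivalent for systems without aptitude, selecting the packages dpkg reports in the "rc" state (removed, configuration files remaining):
    sudo apt-get purge $(dpkg -l | awk '/^rc/ {print $2}')
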
  • Finds all corrupted JPEG files in the current directory and its subdirectories and displays the error or warning found. jpeginfo is part of the jpeginfo package in Debian. Should you wish to get only the corrupted filenames, use cut to extract them (a faster xargs-based variant is sketched below):
    find ./ -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR" | cut -d " " -f 1


    13
    find . -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR"
    vincentp · 2009-06-03 22:08:48 11
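
    A likely speed-up, assuming GNU find and xargs: batching the files through xargs avoids spawning one jpeginfo process per file and copes with spaces in file names.
    find . -name "*jpg" -print0 | xargs -0 jpeginfo -c | grep -E "WARNING|ERROR"
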
  • This one uses dictionary.com


    13
    pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }
    matthewbauer · 2010-03-13 04:23:56 12
  • Though without infinite time and knowledge of how the site will be designed in the future this may stop working, it still serves as a simple, straightforward starting point. It uses the observation that the only item marked as <strong> on the page is the single logical line that includes the italicized fact. If future revisions of the page show failure, or intermittent failure, one may simply alter the above to read:
    wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
    The file lastfact can then be examined whenever the command fails.


    13
    wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
    tali713 · 2010-03-30 23:49:30 81
  • Trick to avoid the form: grep process | grep -v grep. (pgrep offers a similar shortcut; see below.)


    13
    ps axu | grep [a]pache2
    EBAH · 2012-12-15 19:37:19 33
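
    A related aside: on systems with procps installed, pgrep sidesteps the self-match problem entirely, listing the PID and process name.
    pgrep -l apache2
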
  • Put it in your ~/.bashrc. Usage:
    google word1 word2 word3...
    google '"this search gets quoted"'


    13
    function google { Q="$@"; GOOG_URL='https://www.google.de/search?tbs=li:1&q='; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}${Q//\ /+}" | grep -oP '\/url\?q=.+?&amp' | sed 's|/url?q=||; s|&amp||'); echo -e "${stream//\%/\x}"; }
    michelsberg · 2013-04-05 08:04:15 9
  • The trick here is to put brackets [ ] around any one character of the grep string. A bracketed single character such as [s] is a character class that still matches that character, so the pattern matches the same text as before; but grep's own entry in the ps output contains the literal string "[s]ome_text", which the pattern does not match, so grep no longer finds itself. This is useful when you want to parse the output of grep or use the return value in an if-statement without grep's own process causing it to erroneously return TRUE.


    12
    ps aux | grep "[s]ome_text"
    SiegeX · 2009-02-17 02:10:50 12
  • Greps for the search word in a directory and below (defaults to the current directory). -i: case insensitive; -n: shows line numbers; -H: shows the file name.


    12
    grep --color=auto -iRnH "$search_word" $directory
    tobiasboon · 2009-02-21 19:16:33 18
  • Highlights the search pattern in red. (An alias to make this the default is sketched below.)


    12
    grep -i --color=auto
    P17 · 2009-04-27 15:03:28 8
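
    A common convenience, assuming bash: put an alias in ~/.bashrc so colouring is on by default for interactive use.
    alias grep='grep --color=auto'
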
  • This helped me find a botnet that had made it into my system. Of course, this is not a foolproof or guaranteed way to find all of them or even most of them. But it helped me find it.


    12
    cat /var/lib/dpkg/info/*.list > /tmp/listin ; ls /proc/*/exe |xargs -l readlink | grep -xvFf /tmp/listin; rm /tmp/listin
    kamathln · 2009-09-09 18:09:14 14