Commands matching grep (2,284)

  • ps and grep is a dangerous combination -- grep tries to match everything on each line (hence the all-too-common grep -v grep hack). ps -C doesn't use grep; it does an exact match against the process table. Thus you'll get an accurate list with ps -fC sh rather than finding every process with sh somewhere on the line. A comparison is sketched below.


    14
    ps -fC PROCESSNAME
    pooderbill · 2015-04-20 13:09:44 17
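    For comparison (sshd is just an example process name, not taken from the entry above), the grep-based approach and the exact-match approach side by side:

        ps aux | grep sshd | grep -v grep   # substring match; needs the grep -v grep hack
        ps -fC sshd                         # exact name match against the process table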

  • 14
    curl -s https://api.github.com/users/<username>/repos?per_page=1000 |grep git_url |awk '{print $2}'| sed 's/"\(.*\)",/\1/'
    wuseman1 · 2019-11-19 20:31:19 262
  • Checks which files are not under version control, fetches their names and runs them through "svn add". WARNING: doesn't work with whitespace in filenames (a whitespace-tolerant variant is sketched below).


    13
    svn status |grep '\?' |awk '{print $2}'| xargs svn add
    xsawyerx · 2009-01-29 10:33:22 81
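    A whitespace-tolerant sketch of the same idea (assuming unversioned entries in svn status start with "?"), reading each filename line by line instead of relying on word splitting:

        svn status | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done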
  • Purge all configuration files of removed packages; a variant based on dpkg's status column is sketched below.


    13
    sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`
    kelevra · 2009-04-28 11:44:04 15
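    Roughly the same cleanup can be expressed via dpkg's status column, where "rc" marks packages that are removed but still have configuration files (a sketch, not a drop-in replacement):

        dpkg -l | awk '/^rc/ {print $2}' | xargs -r sudo dpkg --purge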
  • Finds all corrupted JPEG files in the current directory and its subdirectories and displays the error or warning found. jpeginfo is part of the jpeginfo package in Debian. Should you wish to get only the corrupted filenames, use cut to extract them: find ./ -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR" | cut -d " " -f 1


    13
    find . -name "*jpg" -exec jpeginfo -c {} \; | grep -E "WARNING|ERROR"
    vincentp · 2009-06-03 22:08:48 11
  • This is a command that I find myself using all the time. It works like regular grep, but returns the paragraph containing the search pattern instead of just the line. It operates on files or standard input. Usage: grepp <PATTERN> <FILE> or <SOMECOMMAND> | grepp <PATTERN> (an awk-based equivalent is sketched below).


    13
    grepp() { [ $# -eq 1 ] && perl -00ne "print if /$1/i" || perl -00ne "print if /$1/i" < "$2";}
    eightmillion · 2010-01-12 04:30:15 13
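    A rough equivalent using awk's paragraph mode (RS="") instead of perl; the IGNORECASE switch for case-insensitive matching is GNU awk only, and pattern/file are placeholders:

        gawk -v RS= -v ORS='\n\n' -v IGNORECASE=1 '/pattern/' file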
  • This one uses dictionary.com


    13
    pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }
    matthewbauer · 2010-03-13 04:23:56 12
  • Without infinite time and knowledge of how the site will be designed in the future this may stop working, but it still serves as a simple, straightforward starting point. It relies on the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact. If future revisions of the page show failure, or intermittent failure, simply alter the command to: wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" The file lastfact can then be examined whenever the command fails.


    13
    wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
    tali713 · 2010-03-30 23:49:30 81

  • 13
    apt-get install `ssh root@host_you_want_to_clone "dpkg -l | grep ii" | awk '{print $2}'`
    BoxingOctopus · 2011-05-10 13:33:51 7
  • Trick to avoid the form: grep process | grep -v grep


    13
    ps axu | grep [a]pache2
    EBAH · 2012-12-15 19:37:19 33
  • Put it in your ~/.bashrc. Usage: google word1 word2 word3... or google '"this search gets quoted"'


    13
    function google { Q="$@"; GOOG_URL='https://www.google.de/search?tbs=li:1&q='; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}${Q//\ /+}" | grep -oP '\/url\?q=.+?&amp' | sed 's|/url?q=||; s|&amp||'); echo -e "${stream//\%/\x}"; }
    michelsberg · 2013-04-05 08:04:15 9
  • Rather than chain a string of greps together and pipe them to awk, use awk to do all the work. In the above example, a line is written to stdout if it matches pattern1 AND pattern2, but NOT pattern3. The equivalent grep pipeline is sketched below.


    12
    awk '/pattern1/ && /pattern2/ && !/pattern3/ {print}'
    themensch · 2009-02-05 15:18:19 41
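    For comparison, the chained-grep pipeline that the awk one-liner above replaces would look something like this (some_command and the patterns are placeholders):

        some_command | grep 'pattern1' | grep 'pattern2' | grep -v 'pattern3'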
  • Counts connections per remote IP address; useful in case of abuse/DoS attacks.


    12
    netstat -anp |grep 'tcp\|udp' | awk '{print $5}' | sed s/::ffff:// | cut -d: -f1 | sort | uniq -c | sort -n
    dt · 2009-02-15 09:16:16 19
  • The trick here is to put brackets [ ] around any one character of the grep string. [s] is a regex character class matching only "s", so the pattern still matches some_text, but the grep process's own command line contains the literal string [s]ome_text, which the pattern does not match. This is useful when you want to parse the output of grep, or use its return value in an if-statement, without grep's own process causing it to erroneously return TRUE (see the sketch below).


    12
    ps aux | grep "[s]ome_text"
    SiegeX · 2009-02-17 02:10:50 12
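    As the description notes, the payoff is in conditionals: the grep process never matches its own command line, so the exit status is reliable. A minimal sketch (apache2 is just an example name):

        if ps aux | grep -q "[a]pache2"; then echo "apache2 is running"; fi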
  • Greps for the search word in a directory and everything below it (defaults to the current directory). -i: case-insensitive; -n: show line numbers; -H: show file names.


    12
    grep --color=auto -iRnH "$search_word" $directory
    tobiasboon · 2009-02-21 19:16:33 18
  • Highlights the search pattern in red.


    12
    grep -i --color=auto
    P17 · 2009-04-27 15:03:28 8
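    Since the entry gives neither a pattern nor a file, the usual way to apply it is as a default via a shell alias (an assumption; put it in ~/.bashrc or similar):

        alias grep='grep -i --color=auto'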
  • Lists the executables of running processes that do not belong to any installed package. This helped me find a botnet that had made its way into my system. Of course, this is not a foolproof or guaranteed way to find all of them, or even most of them, but it helped me find this one.


    12
    cat /var/lib/dpkg/info/*.list > /tmp/listin ; ls /proc/*/exe |xargs -l readlink | grep -xvFf /tmp/listin; rm /tmp/listin
    kamathln · 2009-09-09 18:09:14 14
  • Define a function: vert() { echo $1 | grep -o '.'; } -- then use it to print vertical column headers: paste <(vert several) <(vert parallel) <(vert vertical) <(vert "lines of") <(vert "text can") <(vert "be used") <(vert "for labels") <(vert "for columns") <(vert "of numbers")


    12
    echo "vertical text" | grep -o '.'
    dennisw · 2009-09-11 03:45:04 10
  • Shows a one-line whatis description for everything in /usr/bin: no loop, only one call of grep, scrollable ("less is more", more or less...).


    12
    ls /usr/bin | xargs whatis | grep -v nothing | less
    michelsberg · 2010-01-26 12:59:47 32

  • 12
    grep -Fxv -f file1 file2
    zarathud · 2010-05-28 14:48:27 4
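    This prints the lines of file2 that do not appear verbatim in file1 (-F fixed strings, -x whole-line match, -v invert, -f read patterns from file1) -- a quick set difference. A hypothetical illustration:

        printf '%s\n' a b c > file1
        printf '%s\n' b c d > file2
        grep -Fxv -f file1 file2    # prints: d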
  • Proper screencast with audio using ffmpeg and x264, as per http://verb3k.wordpress.com/2010/01/26/how-to-do-proper-screencasts-on-linux/


    12
    ffmpeg -y -f alsa -ac 2 -i pulse -f x11grab -r 30 -s `xdpyinfo | grep 'dimensions:'|awk '{print $2}'` -i :0.0 -acodec pcm_s16le output.wav -an -vcodec libx264 -vpre lossless_ultrafast -threads 0 output.mp4
    NoahY · 2010-11-19 09:31:56 7
  • When dealing with system resource limits like the maximum number of processes and open files per user, it can be hard to tell exactly what's happening. The /etc/security/limits.conf file defines the ceiling for the values, but not what they currently are, while ulimit -a shows the current values for your shell, and you can set them for new logins in /etc/profile and/or ~/.bashrc with a command like: ulimit -S -n 100000 >/dev/null 2>&1 But with the variability in when those files get read (login vs. any shell startup, interactive vs. non-interactive) it can be difficult to know for sure which values apply to processes that are currently running, like database or app servers. Just find the PID via "ps aux | grep programname", then look at that PID's "limits" file in /proc. Then you'll know for sure what actually applies to that process.


    12
    cat /proc/PID/limits
    dmmst19 · 2011-12-14 16:49:06 7
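    A minimal sketch of the workflow described above (mysqld is only an example process name):

        pid=$(pgrep -o mysqld)                  # oldest matching PID
        grep 'open files' /proc/"$pid"/limits   # shows the soft and hard limits actually in effect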

  • 12
    sudo dmidecode | grep Product
    bbbco · 2012-02-07 16:26:23 17
  • Uses ping with increasing TTL values to collect the routers' IP addresses on the path to the destination host, much as traceroute does.


    12
    for i in {1..30}; do ping -t $i -c 1 google.com; done | grep "Time to live exceeded"
    6bc98f7f · 2012-02-19 13:37:04 8
  • Not guaranteed to always be accurate, but fun for seeing how old your Linux installation is, based on the root partition's filesystem creation date.


    12
    sudo tune2fs -l $(df -h / |(read; awk '{print $1; exit}')) | grep -i created
    thechile · 2013-08-08 15:18:09 17