Commands using grep (1,935)

  • Poor email reputation got you down? Perhaps you're unknowingly forwarding every spam email that reaches info@website.com on to website@gmail.com. This command outputs every forwarding address set up within a Zimbra installation.


    0
    for i in `zmprov -l gaa | cut -f2 -d"@" | uniq -c | awk '{print$2}'`; do zmprov -l gaa -v $i | grep -i zimbraPrefMailForwardingAddress; done
    skylineservers · 2014-11-17 15:24:46 8
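
    A related sketch, not part of the original entry: if you also want to see which account each forward belongs to, zmprov's per-account "ga <account> <attr>" lookup (standard in stock Zimbra installs) can pair them up. One account per line from "gaa" is assumed here.

    zmprov -l gaa | while read -r acct; do
      fwd=$(zmprov -l ga "$acct" zimbraPrefMailForwardingAddress | grep -i zimbraPrefMailForwardingAddress)
      [ -n "$fwd" ] && echo "$acct: $fwd"
    done
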
  • Usage: ipscore <your ip>, which prints a number; for example, ipscore 186.78.151.135 returns 2. A high score represents a bad remote address (honeypot, tor, botnet..). A jq-based variant is sketched after this entry.


    0
    function ipscore() { local OLD_IFS="$IFS" IFS=","; local result="`curl -s "http://wafsec.com/api?ip=$1"`" && local results=(${result}) && printf -- '%s\n' "${results[@]}" | grep '"Score":' | cut -d':' -f2; IFS="$OLD_IFS"; }; ipscore ${target_ip}
    LoadLow · 2014-11-20 23:18:46 8
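
    If jq is installed, the same lookup can skip the IFS juggling. This sketch assumes the API's JSON carries the value under a "Score" key (as the grep above implies) and uses a hypothetical function name to avoid clobbering the original.

    ipscore_jq() { curl -s "http://wafsec.com/api?ip=$1" | jq '.. | .Score? // empty'; }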

  • 0
    find <directory> -iname "*.jar" -print -exec jar -ftv '{}' \; | grep -E "jar|<classname>"
    vivek_saini07 · 2014-11-22 20:17:38 7

  • 0
    pipi () { pip install $1 && echo $(pip freeze | grep -i $1) >> requirements.txt; }
    jkatzer · 2014-12-02 20:55:48 9

  • 0
    sudo netstat -tulpn | grep :8080
    zluyuer · 2014-12-04 06:11:09 9
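
    On systems where net-tools is no longer shipped, ss from iproute2 gives an equivalent view; a minimal sketch:

    sudo ss -tulpn | grep :8080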

  • Shows disk usage (the percentage used) for /home.


    0
    df -h /home | grep -v Filesystem | awk '{print $5}' | sed -n '/%/p'
    rlinux57 · 2014-12-11 16:58:29 8
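
    Building on the same pipeline, a small sketch that only speaks up when usage crosses a threshold (the 90% figure and the single-line df -P output are assumptions):

    usage=$(df -P /home | awk 'NR==2 {gsub(/%/,""); print $5}')
    [ "$usage" -gt 90 ] && echo "/home is ${usage}% full"
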
  • Grepping for ':8080' shows the process running on that specific port.


    0
    sudo lsof -i | grep :8080
    cptjack · 2014-12-15 10:08:31 8
  • To help store and keep important but not often used commands, I resorted to this: a basic for loop which, when fed separate commands as its input, searches the history; any reference to that command or string gets appended to a file named [command name]_hist.txt.
    Revising it to include root/sudo'd commands is probably critical; the sample output above reflects that change, which looks like this:
    for i in docker elinks ufw fail2ban awk sed grep diff nginx apt bash for function bower github rsync sshfs who scp sftp tugboat aws pip npm ssh mysql php 8000 8080 3000 python serve s3ql s3cmd s3api s3 bash init wget; do cat /home/ray/.bash_history |grep -i "$i" >> /home/ray/histories/"${i}"_hist.txt; sudo cat /root/.bash_history |grep -i "$i" >> /home/ray/histories/"${i}"_sudo_hist.txt; done
    Then a simple more to look for a particular result:
    more -s -40 -p -f -d tugboat*txt
    Simple; it solved my problem and alerted me to a lack of certain appearances of commands, which signals a bit of an issue. I'm not so sold on its usefulness as to warrant a bash function or further convenience or logic; we shall see. It could use some tweaking, but what command doesn't!


    0
    for i in [enter list of commands]; do history |grep -i "$i" >> ~/histories/"${i}"_hist.txt;done
    rayanthony · 2014-12-16 03:37:02 8

  • 0
    scp -r `ls | grep -vE "(Pattern1|Pattern2)"` user@remote_host:/location
    zluyuer · 2014-12-16 04:07:35 8
  • Grep for specific function invocations; in this case, either "emit" or "on" followed by "leader".


    0
    grep -E -rn --color=always --exclude-dir=".svn" --exclude-dir="packages" --exclude="*.swp" "(emit|on)\([\'\"]leader" ~/project/ | less -R
    hochmeister · 2014-12-23 20:08:25 9
  • In the field, I needed to script a process to scan for a specific vendor's devices on the network. With the help of nmap, I got all the devices of that particular vendor and started a scripted netcat session to download configuration files from a tftp server. This is the nmap loop (part of the script). You can, however, add another pipe with grep to filter for the vendor/manufacturer's devices only (a sketch of that follows this entry). If you want to check the whole script, see http://pastebin.com/ju7h4Xf4


    0
    nmap -sP 10.0.0.0/8 | grep -v "Host" | tail -n +3 | tr '\n' ' ' | sed 's|Nmap|\nNmap|g' | grep "MAC Address" | cut -d " " -f5,8-15
    jaimerosario · 2014-12-26 18:31:53 13
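
    The extra vendor filter mentioned above can be one more grep before the final cut; "cisco" is only an illustrative manufacturer string:

    nmap -sP 10.0.0.0/8 | grep -v "Host" | tail -n +3 | tr '\n' ' ' | sed 's|Nmap|\nNmap|g' | grep "MAC Address" | grep -i "cisco" | cut -d " " -f5,8-15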

  • 0
    for containerId in $(docker ps | awk '{print $1}' | grep -v CONTAINER); do docker inspect -f "{{ .Name }}" $containerId | sed 's#/##' ; docker port $containerId; done
    bradym · 2015-01-02 19:54:28 8
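
    On newer Docker releases the same name/port listing is available without a loop, via the built-in Go templates; a sketch:

    docker ps --format '{{.Names}}: {{.Ports}}'
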
  • 1. '--color=auto' is of no use in front of a pipe; with '--color=always', grep will mark the section headings.
    2. I suppose the use of grep with '-A 900' or '-B 900' is a 'dirty trick'; sed can do exactly what we want, but grep does the nice colouring (see 1.).
    3. Cutting off the tail (everything starting with 'Weitere Aktionen') first leads to no output if leo doesn't know the translation.


    0
    leo () { lang=en; IFS=+; Q="${*// /%20}"; curl -s "https://dict.leo.org/${lang}de/?search=${Q//+/%20}" | html2text | sed -e '/Weitere Aktionen/,$d' | grep --color=auto --color=always -EA 900 '^\*{5} .*$' ; }
    jandclilover · 2015-01-09 13:58:36 8
  • I was looking for an easy solution to list all of the directories that contain a specific file; not to replace it, but rather to provide a list to a third party or for my own reference. (A GNU find variant is sketched after this entry.)


    0
    find $(pwd) | grep README.txt
    shanford · 2015-01-16 16:58:04 8
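
    With GNU find, the directory names can come straight from find itself, no grep needed (-printf '%h' prints the containing directory; it is a GNU extension):

    find . -name README.txt -printf '%h\n'
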
  • Assumes you are on the branch you want to run the check on. Substitute 'develop' for whatever branch you commonly submit PRs to.


    0
    git rev-parse develop | xargs git diff --name-only | grep -E '^(app|lib|spec).*\.rb' | xargs rubocop -f simple
    vinniefranco · 2015-01-21 08:12:18 8
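
    A sketch of a close relative using git's triple-dot range, which diffs against the merge base with develop rather than its tip, and compares committed changes (HEAD) rather than the working tree:

    git diff --name-only develop...HEAD | grep -E '^(app|lib|spec).*\.rb' | xargs rubocop -f simple
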
  • This is much easier to parse and do something else with (e.g. automagically create ZFS vols) than anything else I've found. It also helps me keep track of which disks are which, for example when I want to replace a disk, or image headers in different scenarios. Being able to match a disk to the kernel's mapping of said drive via the disk's serial number is very helpful.
    ls -l /dev/disk/by-id : normal `ls` command to list the contents of /dev/disk/by-id
    grep -v "wwn-" : perform an inverse search, that is, only output non-matches to the pattern 'wwn-'
    egrep "[a-zA-Z]{3}$" : a regex grep, looking for three letters at the end of a line (to filter out fluff)
    sed 's/\.\.\/\.\.\///' : use sed (stream editor) to remove all occurrences of "../../"
    sed -E 's/.*[0-9]{2}:[0-9]{2}\s//' : strip out all user and permission fluff; the -E option lets us use extended (modern) regex notation (larger control set)
    sed -E 's/->\ //' : strip out the ascii arrows "-> "
    sort -k2 : sort the resulting information alphabetically, on column 2 (the disk letters)
    awk '{print $2,$1}' : swap the order of the columns so the output is easier to read/utilize
    sed 's/\s/\t/' : replace the space between the two columns with a tab character, making the output more friendly
    For large ZFS pools, this made creating my vdevs immeasurably easy. By keeping track of which disks were in which slot (spreadsheet) via their serial numbers, I was able to create my vols simply by copying the full output for a disk (not the letter) and pasting it into my command, thereby knowing exactly which disk, in which slot, was going into the vdev. Example command below:
    zpool create tank raidz2 -o ashift=12 ata-... ata-... ata-... ata-... ata-... ata-...


    0
    ls -l /dev/disk/by-id |grep -v "wwn-" |egrep "[a-zA-Z]{3}$" |sed 's/\.\.\/\.\.\///' |sed -E 's/.*[0-9]{2}:[0-9]{2}\s//' |sed -E 's/->\ //' |sort -k2 |awk '{print $2,$1}' |sed 's/\s/\t/'
    lig0n · 2015-01-25 19:29:40 8
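
    On systems with a reasonably recent util-linux, lsblk can print a similar serial-to-device mapping directly (column availability varies by version); a sketch:

    lsblk -d -o NAME,SERIAL,MODEL
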
  • # grab the first line showing php version
    php -i | grep 'PHP Version' | awk '{if(NR==1)print}'
    php -i | grep 'PHP Version' | sed -n '1!p'
    php -i | grep 'PHP Version' | tail -n 1


    0
    php -i | grep 'PHP Version' | awk '{if(NR==1)print}'
    crisuwork · 2015-01-27 11:12:19 7
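
    If only the version string itself is needed, php can print it without any grep at all:

    php -r 'echo PHP_VERSION, PHP_EOL;'
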
  • Useful for big systems with lots of cards. (Update: does not work with USB disks)


    0
    udevadm info -q all -n /dev/sdc | grep ID_PATH | cut -d'-' -f 2 | xargs -n 1 lspci -s
    mhs · 2015-01-27 15:34:02 9
  • Description by segments delimited by pipe (|):
    1. List all git branches
    2. Exclude master
    3. Trim output and remove display elements such as * next to the current branch
    4. Repeat the branch name after a space (output on each line: branch_name branch_name)
    5. Prepend each line with the git tag command
    6. Execute the output with bash


    0
    git branch | grep -v "master" | sed 's/^[ *]*//' | sed 's/.*/& &/' | sed 's/^/git tag archive\//' | bash
    Trindaz · 2015-01-31 00:26:15 11
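
    A sketch of the same idea that avoids piping generated text into bash, using git's own ref listing (the "archive/" tag prefix is kept from the original; the exclusion here matches 'master' exactly, unlike the looser grep above):

    git for-each-ref --format='%(refname:short)' refs/heads | grep -v '^master$' | while read -r b; do git tag "archive/$b" "$b"; done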

  • 0
    cat /dev/urandom | strings | grep -o '[[:alnum:]]' | head -n 15 | tr -d '\n'; echo
    rekky · 2015-01-31 23:07:23 8
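
    A shorter variant of the same trick that lets tr do the filtering (assumes a tr with POSIX character-class support):

    tr -dc '[:alnum:]' < /dev/urandom | head -c 15; echo
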
  • Warnings and errors will be suppressed.


    0
    grep --include=\*.html -R "some string" . 2>/dev/null
    sjmixon · 2015-02-04 17:59:41 8
  • Finds the date of the first commit in a git repository branch.


    0
    git rev-list --all|tail -n1|xargs git show|grep -v diff|head -n1|cut -f1-3 -d' '
    binaryten · 2015-02-04 19:35:18 13
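
    A sketch of the same lookup using git's own formatting instead of parsing "git show" output (the printed date format differs from the original):

    git log --reverse --date=short --format=%ad | head -n 1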

  • 0
    ifconfig -a | grep inet | awk '{print $2}' | cut -d ":" -f 2 | grep -v 127.0.0.1
    Dairenn · 2015-02-09 19:19:20 8
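
    On Linux hosts where ifconfig is deprecated, iproute2 yields the same list; a sketch:

    ip -4 -o addr show | awk '{print $4}' | cut -d/ -f1 | grep -v 127.0.0.1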

  • 0
    ls -lt | grep ^- | awk 'NR>=N {print $9}' | xargs rm -rf -i
    clongbupt · 2015-02-10 06:29:24 8
  • Use lsof and grep for any PID matching a given name such as "node".


    0
    lsof -i -n -P | grep -e "$(ps aux | grep node | grep -v grep | awk -F' ' '{print $2}' | xargs | awk -F' ' '{str = $1; for(i = 2; i < NF; i++) {str = str "\\|" $i} print str}')"
    hochmeister · 2015-02-14 23:24:00 10
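
    pgrep can build the PID list far more directly; a sketch that hands the comma-separated PIDs straight to lsof (it assumes at least one matching process exists):

    lsof -i -n -P -p "$(pgrep -d, node)"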

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Schedule Nice Background Commands That Won't Die on Logout - Alternative to nohup and at
Check out the usage of 'trap'; you may not have seen this one much. This command provides a way to schedule commands at certain times by running them after sleep finishes sleeping. In the example, 'sleep 2h' sleeps for 2 hours. What is cool about this command is that it uses the 'trap' builtin bash command to remove the SIGHUP trap that normally exits all processes started by the shell upon logout. The 'trap 1' command then restores the normal SIGHUP behaviour. It also uses the 'nice -n 19' command, which causes the sleep process to be run with minimal CPU. Further, it runs all the commands within the 2nd parentheses in the background. This is sweet cuz you can fire off as many of these as you want. Very helpful for shell scripts.

Open Remote Desktop (RDP) from command line having a custom screen size
This example uses xfreerdp, which builds upon the development of rdesktop. This example usage will also send you the remote machine's sound.

Repeat a command until stopped
In this case it runs the command 'curl localhost:3000/site/sha', waiting the amount of time given to sleep (i.e. 1 second) between runs and appending each run to the console. This works well for any command where the output is less than your line width. This is unlike watch, because watch always clears the display.

faster version of ls *
I know it's not much, but it is very useful in time-consuming scripts (cron, rc.d, etc.).

list block devices
Shows all block devices in a tree with descriptions of what they are.

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.
sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$H" -gt 0 ] && printf "%02g:" "$H"
  printf "%02g:%02g\n" "$M" "$S"
}
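
A quick usage check of the helper above (the input value is arbitrary): 93,784 seconds is 1 day, 2 hours, 3 minutes and 4 seconds.

sec2dhms 93784   # prints 1:02:03:04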

Record a screencast and convert it to an mpeg
Grab X11 input and create an MPEG at 25 fps with the resolution 800x600

Lists unambiguously the names of all xml elements used in files in the current directory
This set of commands was very convenient for me when I was preparing some xml files for typesetting a book. I wanted to check what styles I had to prepare but couldn't remember all the tags that I used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process xml files with xsl when using one's own xml application.

Recurse through directories easily
This is a simple case of recursing through all directories, adding the '.bak' extension to every file. Of course, the 'cp $file $file.bak' could be any code you need to apply to your recursion, including tests, other functions, creating variables, doing math, etc. Simple and clean recursion.

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"

