Commands using grep (1,930)

  • As an alternative to appending a grep -v grep, you can use a simple regular expression in the search pattern (make the first letter a single-letter bracket list ;-)) so the grep command itself doesn't show up in the results. See the sketch below the entry.


    72
    ps aux | grep [p]rocess-name
    olorin · 2009-08-13 05:44:45 19
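
    A quick illustration of why the bracket trick works (the process name here is
    hypothetical): the pattern [p]rocess-name still matches the literal string
    "process-name", but the grep process's own command line contains the brackets
    and therefore never matches itself.

    # Without the brackets, grep may list its own process:
    ps aux | grep sshd        # can include the "grep sshd" line
    # With the brackets, the grep line no longer matches the pattern:
    ps aux | grep '[s]shd'    # lists only real sshd processes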

  • 67
    tr -c "[:digit:]" " " < /dev/urandom | dd cbs=$COLUMNS conv=unblock | GREP_COLOR="1;32" grep --color "[^ ]"
    allinurl · 2009-06-30 17:23:49 14
  • Prints a graphical directory tree from your current directory.


    62
    ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/ /' -e 's/-/|/'
    unixmonkey842 · 2009-02-15 20:43:21 14
  • This is the result of a several-week venture without X. I found myself totally happy without X (and by extension without Flash) and was able to do just about anything but watch YouTube videos... so this is the solution I came up with for that. I am sure this can be done better, but it does indeed work... and tends to work far better than YouTube's proprietary Flash player ;-) Replace $i with any YouTube ID you want and this will scrape the site for the _real_ URL to the full-quality .FLV file on YouTube's server, then hand that over to mplayer (or vlc or whatever you want) to be streamed. In some browsers you can replace $i with just a %, or put this in a shell script, so YouTube IDs can be handed directly off to your media player of choice for true streaming without the need for Flash or a downloader like clive. (I do however fully recommend clive if you wish to archive videos instead of streaming them.) If any interest is shown I would be more than happy to provide similar commands for other sites; most streaming Flash players use similar logic to YouTube. Edit (05/03/2011): updated the line to work with current YouTube. It could be a lot prettier, but I will probably follow up with another update when I figure out how to get rid of that pesky grep; sed should take that syntax... but it doesn't. Original (no longer working) command: mplayer -fs $(echo "http://youtube.com/get_video.php?$(curl -s $youtube_url | sed -n "/watch_fullscreen/s;.*\(video_id.\+\)&title.*;\1;p")")


    58
    i="8uyxVmdaJ-w";mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$i" | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') | grep -oP '^[^|,]*')
    lrvick · 2009-03-09 03:57:44 39

  • 57
    alias 'ps?'='ps ax | grep '
    fzero · 2009-02-05 13:36:37 33
  • Finds random strings within /dev/urandom. grep filters to just alphanumeric characters, head prints the first 30, and tr removes the line feeds.


    54
    strings /dev/urandom | grep -o '[[:alnum:]]' | head -n 30 | tr -d '\n'; echo
    jbcurtis · 2009-02-16 00:39:28 21
  • Written for Linux; the real point of the example is how to produce ASCII text graphs from a numeric value (anything where uniq -c is useful is a good candidate). A smaller sketch of the same idiom follows the entry.


    53
    netstat -an | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | awk '{ printf("%s\t%s\t",$2,$1) ; for (i = 0; i < $1; i++) {printf("*")}; print "" }'
    knassery · 2009-04-27 22:02:19 16
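
    A minimal sketch of the same idiom, assuming any "count label" input such as
    uniq -c produces: the awk loop simply prints one asterisk per count. The input
    below (login shells from /etc/passwd) is just an arbitrary example.

    # Count login shells and draw a bar per shell.
    cut -d: -f7 /etc/passwd | sort | uniq -c |
      awk '{ printf("%s\t", $2); for (i = 0; i < $1; i++) printf("*"); print "" }'
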
  • This is how I typically grep: -R recurses into subdirectories, -n shows line numbers of matches, -i ignores case, -s suppresses "doesn't exist" and "can't read" messages, and -I ignores binary files (technically, it processes them as having no matches, which matters when showing inverted results with -v). I have grep aliased to "grep --color=auto" as well, but that's a matter of formatting, not function; both aliases are sketched below.


    49
    grep -RnisI <pattern> *
    birnam · 2009-09-22 15:09:43 21
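
    The color alias mentioned above is the conventional one; a minimal sketch of
    both aliases for ~/.bashrc (the short name "g" is hypothetical, not from the
    original):

    # Color matches automatically when writing to a terminal.
    alias grep='grep --color=auto'
    # Hypothetical shorthand for the recursive search above.
    alias g='grep -RnisI'
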
  • Just makes some data scroll off the terminal. Wow.


    47
    cat /dev/urandom | hexdump -C | grep "ca fe"
    BOYPT · 2010-09-27 08:20:44 18
  • Ever ask yourself "How much data would be lost if I pressed the reset button?" Scary, isn't it? A variant for watching the value change is sketched below.


    34
    grep ^Dirty /proc/meminfo
    h3xx · 2011-08-24 08:48:49 12
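
    To watch the value change in real time, a simple variant (assuming the
    standard procps "watch" utility is available) might be:

    # Refresh every second to see dirty pages being flushed to disk.
    watch -n 1 'grep ^Dirty /proc/meminfo'
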
  • I have a bash alias for this command line and find it useful for searching C code for error messages; a sketch of such a helper follows the entry. The -H tells grep to print the filename. You can omit the -i to match case exactly, or keep it for case-insensitive matching. The find command finds all .c and .h files.


    33
    find . -name "*.[ch]" -exec grep -i -H "search pharse" {} \;
    bunedoggle · 2009-05-06 15:22:49 18
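
    A sketch of the kind of alias the author mentions, written as a bash function
    so the search phrase can be passed as an argument (the name "cgrep" is
    hypothetical):

    # Search all .c and .h files under the current directory, case-insensitively.
    cgrep() { find . -name "*.[ch]" -exec grep -i -H "$1" {} \; ; }
    # usage: cgrep "error opening file"
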
  • Get your colorized grep output in less(1). This involves two things: forcing grep to output colors even though its output isn't going to a terminal, and telling less to pass those escape sequences through properly. A fuller invocation is sketched below.


    32
    grep --color=always | less -R
    dinomite · 2009-05-20 20:30:19 8
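
    As recorded, the command is a pipeline fragment; a fuller invocation, with a
    hypothetical pattern and log file, might look like:

    # --color=always keeps the escape codes even though output is piped;
    # less -R renders them instead of printing raw escape sequences.
    grep --color=always 'pattern' /var/log/syslog | less -R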

  • 30
    grep . filename > newfilename
    alvinx · 2009-09-25 09:25:34 12
  • You have an external USB drive or key. Run this command (using the file path of anything on the device) and it will simulate unplugging that device. If you just want the port, run only the echo part: echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-")


    30
    echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-") | sudo tee /sys/bus/usb/drivers/usb/unbind
    tweet78 · 2014-04-06 12:06:29 15
  • PDF files are simultaneously wonderful and heinous. They are wonderful in being ubiquitous and mostly cross-platform, and heinous in being very difficult to work with from the command line: hard to search, grep, use only the text inside, or use outside of proprietary products. xpdf is a wonderful set of PDF tools. It is in many Linux distros and can be installed on OS X. While primarily an open PDF viewer for X, xpdf includes the tool "pdftotext", which can extract formatted or unformatted text from any PDF that contains text. That text stream can then be further processed by grep or other tools. The '-' after the file name directs output to stdout rather than to a text file with the same name as the PDF. Make sure you use version 3.02 of pdftotext or later; earlier versions clipped lines. The lines extracted from a PDF without the "-layout" option are very long (more like paragraphs) and are useful mainly for testing that a pattern exists in the file. With "-layout" the output resembles the original lines, though it is not perfect; a sketch of that variant follows the entry. xpdf is available open source at http://www.foolabs.com/xpdf/


    27
    pdftotext [file] - | grep 'YourPattern'
    drewk · 2010-02-14 21:42:35 9
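
    A sketch of the -layout variant mentioned above (the file name and pattern are
    placeholders):

    # -layout keeps the extracted text closer to the PDF's original line breaks,
    # which makes grep's line-oriented matching more useful.
    pdftotext -layout report.pdf - | grep -n 'YourPattern'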

  • 27
    grep -Fx -f file1 file2
    zarathud · 2010-05-28 14:50:14 21

  • 26
    netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
    himynameisthor · 2009-02-05 18:02:59 23
  • Recursively traverses the directory structure from . down, looks for the string "oldstring" in all files, and replaces it with "newstring" wherever found. A perl equivalent: grep -rl oldstring . | xargs perl -pi~ -e 's/oldstring/newstring/' A safer variant that keeps backups is sketched below the entry.


    25
    grep -rl oldstring . | xargs sed -i -e 's/oldstring/newstring/'
    netfortius · 2009-03-03 20:10:19 14
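
    A slightly safer variant, assuming GNU sed, that keeps a backup copy of every
    file it modifies:

    # Each changed file is saved alongside the original with a .bak suffix.
    grep -rl oldstring . | xargs sed -i.bak -e 's/oldstring/newstring/'
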
  • This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic. To get a random xkcd comic, I also use the following: xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}


    24
    xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
    eightmillion · 2009-11-27 09:11:47 19
  • grep searches through a file and prints out all the lines that match some pattern. Here, the pattern is some string that is known to be in the deleted file. The more specific this string can be, the better. The file being searched by grep (/dev/sda1) is the partition of the hard drive the deleted file used to reside in. The "-a" flag tells grep to treat the hard drive partition, which is actually a binary file, as text. Since recovering the entire file would be nice instead of just the lines that are already known, context control is used. The flags "-B 25 -A 100" tell grep to print out 25 lines before a match and 100 lines after a match. Be conservative with estimates on these numbers to ensure the entire file is included (when in doubt, guess bigger numbers). Excess data is easy to trim out of results, but if you find yourself with a truncated or incomplete file, you need to do this all over again. Finally, the "> results.txt" instructs the computer to store the output of grep in a file called results.txt. Source: http://spin.atomicobject.com/2010/08/18/undelete?utm_source=y-combinator&utm_medium=social-media&utm_campaign=technical


    24
    grep -a -B 25 -A 100 'some string in the file' /dev/sda1 > results.txt
    olalonde · 2010-08-19 20:07:42 16
  • -l outputs only the file names, -i ignores case, -r descends into subdirectories.


    23
    grep -lir "some text" *
    decept · 2009-05-20 19:44:35 7
  • Change the -p argument for the port number. See "man nmap" for different ways to specify address ranges; one such variant is sketched below the entry.


    19
    nmap -sT -p 80 -oG - 192.168.1.* | grep open
    bendavis78 · 2009-02-11 17:47:27 9
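
    One such variant, scanning a whole /24 for SSH instead of HTTP (the addresses
    and port are only an example):

    # -oG - writes greppable output to stdout; grep keeps only open ports.
    nmap -sT -p 22 -oG - 192.168.1.0/24 | grep open
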
  • When you fill in a form with Firefox, you see things you entered in previous forms with the same field names. This command lists everything Firefox has registered. Using a "delete from", you can remove annoying Google queries, for example ;-) A sketch of such a clean-up follows the entry.


    19
    cd ~/.mozilla/firefox/ && sqlite3 `cat profiles.ini | grep Path | awk -F= '{print $2}'`/formhistory.sqlite "select * from moz_formhistory" && cd - > /dev/null
    klipz · 2009-04-13 20:23:37 8
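
    A sketch of the "delete from" clean-up mentioned above, reusing the same
    profile path; the field name 'q' (Google's search box) is an assumption, so
    inspect your own data before deleting anything.

    # Close Firefox and back up formhistory.sqlite before editing it.
    sqlite3 ~/.mozilla/firefox/*/formhistory.sqlite \
      "delete from moz_formhistory where fieldname = 'q';"
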
  • Usage: flight_status airline_code flight_number [offset_of_departure_date_from_today]. So, to track a flight which departed yesterday, the optional third parameter should have a value of -1, e.g. flight_status ua 3655 -1. Sample output: Status: Arrived; Departure: San Francisco, CA (SFO), Scheduled: 6:30 AM, Jan 3, Takeoff: 7:18 AM, Jan 3, Term-Gate: Term 1 - 32A; Arrival: Newark, NJ (EWR), Scheduled: 2:55 PM, Jan 3, At Gate: 3:42 PM, Jan 3, Term-Gate: Term C - C131. Note: html2text needs to be installed for this command; only tested on Ubuntu 9.10.


    19
    flight_status() { if [[ $# -eq 3 ]];then offset=$3; else offset=0; fi; curl "http://mobile.flightview.com/TrackByRoute.aspx?view=detail&al="$1"&fn="$2"&dpdat=$(date +%Y%m%d -d ${offset}day)" 2>/dev/null |html2text |grep ":"; }
    suhasgupta · 2010-01-04 15:49:09 14
  • Tested on Linux and OS X.


    18
    lsof -Pni4 | grep LISTEN
    evenme · 2009-08-21 22:51:41 44