Commands tagged google (67)

  • Just add this to your .bashrc file. Use quotes when the query contains multiple words.

    findlocation() { place=`echo $1 | sed 's/ /%20/g'` ; curl -s "$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
    shadyabhi · 2010-10-18 21:11:42 0
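
    The sed idiom at the heart of the function can be tried on its own. A minimal sketch; encode_spaces is a hypothetical name, and the assumption is that spaces are the only characters needing escaping for this kind of query:

    ```shell
    # Percent-encode spaces the way the function does: sed replaces
    # each space with %20 before the value is used in a URL.
    encode_spaces() { printf '%s' "$1" | sed 's/ /%20/g'; }
    encode_spaces "1600 Pennsylvania Ave"   # -> 1600%20Pennsylvania%20Ave
    ```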

  • 2
    say() { curl -sA Mozilla -d q=`python3 -c 'from urllib.parse import quote_plus; from sys import stdin; print(quote_plus(stdin.read()[:100]))' <<<"$@"` '' | mpg123 -q -; }
    kev · 2011-11-26 09:18:16 5
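
    The embedded Python one-liner percent-encodes stdin and truncates it to 100 characters before it becomes a query parameter. The same idiom standalone; urlencode is an illustrative name, not part of the original:

    ```shell
    # Percent-encode stdin for use as a URL query parameter, truncated
    # to 100 characters as in the function above.
    urlencode() { python3 -c 'from urllib.parse import quote_plus; from sys import stdin; print(quote_plus(stdin.read().strip()[:100]))'; }
    echo 'hello world & friends' | urlencode   # -> hello+world+%26+friends
    ```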

  • 2
    say() { wget -q -U Mozilla -O output.mp3 "$1"; open output.mp3 &>/dev/null || xdg-open output.mp3 &>/dev/null; }
    runvnc · 2014-04-17 07:35:49 0
  • Searches Google, but requires no quotes, and will search all the terms given on the command line, e.g. google foo bar opens the resulting search URL. You could also use awk to replace all spaces with a +, which is how Google search handles spaces, but that makes it more than one line.

    function google () { st="$@"; open "${st}"; }
    plasticphyte · 2014-05-07 03:14:05 0
  • Improved Google text-to-speech function. Lets you specify the language and plays the sound in the terminal. Automatically removes the downloaded file after playing it. Usage: say LANGUAGE TEXT Examples: say en "This is a test." say pl "To jest test"

    function say { wget -q -U Mozilla -O google-tts.mp3 "$1&q=$2"; open google-tts.mp3 &>/dev/null || mplayer google-tts.mp3 &>/dev/null; rm google-tts.mp3; }
    Zath · 2014-08-01 23:43:16 0
  • Access a random news web page on the internet. The Links browser can of course be replaced by Firefox or any modern graphical web browser.

    links $( a=( $( lynx -dump -listonly "" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
    pascalv · 2016-07-26 11:52:12 3

  • 2
    nslookup -q=TXT | grep -Po '\b([0-1]?\d{1,2}|2[0-4]\d|25[0-5])(\.([0-1]?\d{1,2}|2[0-4]\d|25[0-5])){3}(/\d{1,2})\b'
    emphazer · 2018-10-05 12:50:48 0
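
    The PCRE above matches dotted-quad IPv4 networks (each octet 0-255) with a CIDR suffix. Applied to a canned SPF-style string rather than a live DNS lookup (the record text is made up for illustration; GNU grep with -P support is assumed):

    ```shell
    # Pull the ip4: networks out of a sample SPF-style record using the
    # same octet-validating regex as the command above.
    spf='v=spf1 ip4:64.233.160.0/19 ip4:66.102.0.0/20 ~all'
    echo "$spf" | grep -Po '\b([0-1]?\d{1,2}|2[0-4]\d|25[0-5])(\.([0-1]?\d{1,2}|2[0-4]\d|25[0-5])){3}(/\d{1,2})\b'
    ```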

  • 1
    spellcheck(){ curl -sd "<spellrequest><text>$1</text></spellrequest>" | sed 's/.*<spellresult [^>]*>\(.*\)<\/spellresult>/\1/;s/<c \([^>]*\)>\([^<]*\)<\/c>/\1;\2\n/g' | grep 's="1"' | sed 's/^.*;\([^\t]*\).*$/\1/'; }
    matthewbauer · 2010-02-17 01:55:28 1
  • Full command: google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx'|grep -oP '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort. You'll need googlecl and python-gdata. First set up googlecl by running "google contacts list name,email" and giving your PC access to your contacts. Then run the command as-is, or use this variant to dump the result into the cone-address.txt file in your home dir: google contacts list name,name,email | perl -p -n -e 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' | grep -o -P '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort > ~/cone-address.txt. Then import it into cone. The command filters out duplicate emails, as well as contacts that have no email address and so show up as N/A (Picasa photo persons without email, for example).

    google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' #see below for full command
    Raymii · 2010-07-12 16:50:44 3
  • Opens the Google "I'm Feeling Lucky" result in lynx, the command-line browser.

    lucky(){ url=$(echo "$@&btnI=I%27m+Feeling+Lucky&aq=f&oq=" | sed 's/ /+/g'); lynx $url; }; lucky "Emperor Norton"
    smop · 2010-08-13 00:23:25 1
  • Alternative to the findlocation function above, using $* instead of $1, so there is no need to quote multi-word locations.

    findlocation() { place=`echo $* | sed 's/ /%20/g'` ; curl -s "$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
    nimmylebby · 2010-10-18 21:38:20 2
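
    The difference between the two variants comes down to $1 versus $*; a minimal demonstration with throwaway function names:

    ```shell
    # "$1" is only the first positional argument; "$*" joins all of
    # them (with the first character of IFS, a space by default).
    first() { echo "$1"; }
    all()   { echo "$*"; }
    first New York   # prints: New
    all New York     # prints: New York
    ```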
  • Simple edit to make this work on OS X. Just add it to your ~/.profile and run source ~/.profile.

    rtfm() { help $@ || man $@ || open "$@"; }
    vaporub · 2011-01-26 06:23:42 0
  • wget -qO - ",de&client=te" does the actual Google Dictionary query and returns a JSON string wrapped in a callback tag. sed 's/dict_api\.callbacks.id100.//' removes the beginning of the tag, and sed 's/,200,null)//' removes the end. There are also some special characters which could cause problems with some JSON parsers, so if you get errors, this is probably the cause (sed is your friend). I also like to trim the "webDefinitions" part, because it (sometimes) contains misleading information: sed 's/\,\"webDefinitions.*//' (but remember to append a "}" at the end, or the JSON string will be invalid). The output also contains links to mp3 files with the pronunciation. As of now, this is only usable for English; if you choose another language, you will only get webDefinitions (which are crap).

    wget -qO - ",de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 0
  • Usage: say hello world how are you today

    say() { local IFS=+;mplayer "$*"; }
    RanyAlbeg · 2011-09-08 13:02:46 0
  • This command will place symbolic links to files listed in an m3u playlist into a specified folder. Useful for uploading playlists to Google Music. prefix = The full path prefix to file entries in your .m3u file, if the file paths are relative. For example, if you have "Music/folder/song.mp3" in your list.m3u, you might want to specify "/home/username" as your prefix. list.m3u = Path to the playlist target_folder = Path to the target folder in which you would like to create symlinks

    (IFS=$'\n'; ln -sf $(awk '((NR % 2) != 0 && NR > 1) {print "prefix" $0}' list.m3u) target_folder)
    lxe · 2011-09-25 16:45:28 2
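
    The NR % 2 test relies on the extended .m3u layout: line 1 is the #EXTM3U header, after which #EXTINF: metadata lines and file paths alternate, so the paths sit on odd-numbered lines after the first. A sketch using a made-up playlist and the example prefix from the description (with a trailing slash added so the joined path is valid):

    ```shell
    # Build a throwaway extended playlist with relative paths, then print
    # only the path lines with the prefix prepended, as the awk filter does.
    cat > list.m3u <<'EOF'
    #EXTM3U
    #EXTINF:123,Artist - Title
    Music/folder/song.mp3
    #EXTINF:90,Other - Track
    Music/other/track.mp3
    EOF
    awk '((NR % 2) != 0 && NR > 1) {print "/home/username/" $0}' list.m3u
    ```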
  • Get the first 10 Google results for a query, showing only the URLs of the results. Use + to join search terms, e.g. commandlinefu+google.

    gg(){ lynx -dump$@ | sed '/[0-9]*\..http:\/\/\/search?q=related:/!d;s/...[0-9]*\..http:\/\/\/search?q=related://;s/&hl=//';}
    chon8a · 2012-04-21 03:31:26 3
  • (1) Required: python-googl (install with: pip install python-googl). (2) Get your API key from the Google API console.

    python -c 'import googl; print googl.Googl("<your_google_api_key>").shorten("'$someurl'")[u"id"]'
    shr386 · 2012-05-31 17:14:17 0

  • 1
    Q="Hello world"; GOOG_URL=""; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q/\ /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"
    westeros91 · 2012-08-26 20:13:21 0
  • Same as the other rtfm's, but using the more correct xdg-open instead of $BROWSER. I can't find a way to open info only if the term exists, so it stays out of my version.

    rtfm() { help $@ || man $@ || xdg-open "$@"; }
    KlfJoat · 2014-04-25 04:17:03 0
  • translate <some phrase> [output-language] [source-language]
    1) "some phrase" should be in quotes
    2) [output-language] - optional (default: English)
    3) [source-language] - optional (default: auto)
    Examples (each prints "hello little rabbit"):
    translate "bonjour petit lapin"
    translate "bonjour petit lapin" en
    translate "bonjour petit lapin" en fr

    translate(){ wget -U "Mozilla/5.0" -qO - "${3:-auto}&tl=${2:-en}&dt=t&q=$1" | cut -d'"' -f2; }
    klisanor · 2014-06-10 12:08:51 0
  • sort -R randomizes the list; head -n1 takes the first entry.

    links `lynx -dump -listonly "" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "" | sort -R | uniq | head -n1`
    mogoh · 2016-07-26 12:54:53 1
  • A bit shorter; the parentheses are not needed but are added for clarity.

    nslookup -q=TXT | grep -Eo 'ip4:([0-9\.\/]+)' | cut -d: -f2
    jseppe · 2018-10-05 18:19:15 2
  • Substitute the URL with your private/public XML URL from the calendar sharing settings, substitute the dates (YYYY-mm-dd), and adjust the perl parsing part to your needs.

    wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}';
    unixmonkey4704 · 2009-07-23 14:48:54 1
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. This version was designed not only to run fast, but to work using BusyBox. BusyBox is a collection of basic Unix tools that have been compiled into a single binary to save space on tiny installations of Unix. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped down versions of wget, tr, and sed. It turns out that those tools suffice for many tasks. Known bugs: This script does not handle HTML entities at all. I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could do it. Also, this script can only define a single word, not phrases. (Well, you could if you typed in %20, but that'd be gross.) Lastly, this script does not show the URL where definitions were found. Given the randomness of the Net, that last bit of information is often key.

    wget -q -U busybox -O- "$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47 1
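
    The description above asks for an HTML-entity decoder that works within BusyBox. A minimal sed-only sketch for the five most common named entities; decode_entities is an illustrative name, and &amp; is handled last so already-decoded ampersands are not expanded twice:

    ```shell
    # Decode common HTML entities with plain sed substitutions
    # (no GNU extensions, so BusyBox sed should handle them too).
    decode_entities() {
      sed -e 's/&lt;/</g' -e 's/&gt;/>/g' -e 's/&quot;/"/g' \
          -e "s/&#39;/'/g" -e 's/&amp;/\&/g'
    }
    echo 'Fish &amp; chips &lt;b&gt;rock&lt;/b&gt;' | decode_entities
    # -> Fish & chips <b>rock</b>
    ```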
  • Check your local temperature based on geolocation.

    curl -s$(curl -s$(curl -s | sed -e'1d;3d' -e's/C.*: \(.*\)/\1/' -e's/ /%20/g' -e"s/'/%27/g") | sed 's|.*<t.*f data="\([^"]*\)"/>.*|\1\n|'
    o0110o · 2010-02-14 19:44:54 1