Commands tagged google (67)

  • Usage: translate <phrase> <source-language> <output-language>. Example: translate hello en es. See http://en.wikipedia.org/wiki/List_of_ISO_639-1_codes for a list of language codes. (A multiword-friendly sketch follows this entry.)


    65
    translate(){ wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
    matthewbauer · 2010-03-08 03:15:48 9
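
    A multiword-friendly sketch (the wrapper name is illustrative, and it assumes the now-retired AJAX language API still responds); spaces in the phrase are replaced with '+' before building the URL:

    # sketch: same endpoint, but '+'-encode the phrase so multiword input works
    translate_phrase(){ local q="${1// /+}"; wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=${q}&langpair=${2}|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
    # usage: translate_phrase "good morning" en es
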
  • RTFMFTW.


    39
    rtfm() { help $@ || man $@ || $BROWSER "http://www.google.com/search?q=$@"; }
    hunterm · 2011-01-05 02:53:38 2
  • Google just released a new command line tool offering all sorts of new services from the command line. One of them is uploading a YouTube video, but there are plenty more Google services to interact with. Download it here: http://code.google.com/p/googlecl/ Manual: http://code.google.com/p/googlecl/wiki/Manual This specific command courtesy of Lifehacker: http://lifehacker.com/5568817/ though all can be found in the manual linked above. (A small wrapper sketch follows this entry.)


    38
    google docs edit --title "To-Do List" --editor vim
    spiffwalker · 2010-06-21 16:15:42 3
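
    A small wrapper sketch (the function name is illustrative; it assumes googlecl is installed and already authorized) so the command above can be rerun with one word:

    # sketch: reopen the same Google Docs to-do list in vim
    todo(){ google docs edit --title "To-Do List" --editor vim; }
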
  • EDIT: command updated to support accented characters! Works in any of 58 Google-supported languages (some sound like crap; English is the best IMO). You get an mp3 file containing your query in spoken language. There is a limit of 100 characters for the "q" parameter, so be careful; see the sketch after this entry for a wrapper that enforces it. The "tl" parameter contains the target language.


    37
    wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=hello+world"
    sairon · 2011-03-08 14:05:36 12
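
    A minimal sketch of a wrapper that enforces the 100-character limit mentioned above (the function name is illustrative; the output filename and default language are taken from the original command):

    # sketch: refuse queries over 100 characters, '+'-encode spaces, default to English
    tts(){ [ ${#1} -gt 100 ] && { echo "query too long (max 100 chars)" >&2; return 1; }; wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=${2:-en}&q=${1// /+}"; }
    # usage: tts "hello world" es
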
  • This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portions aren't strictly necessary; they just make the output a bit more readable, but this also works: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';} If your version of grep doesn't have Perl-compatible regex support, then you can use this version: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}


    18
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
    eightmillion · 2010-01-29 05:01:11 4
  • Usage examples: say hello; say "hello world"; say hello+world


    14
    say() { mplayer "http://translate.google.com/translate_tts?q=$1"; }
    daa · 2011-09-08 03:34:24 2
  • Put it in your ~/.bashrc. Usage: google word1 word2 word3... google '"this search gets quoted"'


    13
    function google { Q="$@"; GOOG_URL='https://www.google.de/search?tbs=li:1&q='; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}${Q//\ /+}" | grep -oP '\/url\?q=.+?&amp' | sed 's|/url?q=||; s|&amp||'); echo -e "${stream//\%/\x}"; }
    michelsberg · 2013-04-05 08:04:15 2
  • Usage: t2s 'How are you?' Nice because it automatically names the mp3 file using up to 15 characters. A modified version uses bash string manipulation instead of tr; a sketch of that variant follows this entry.


    13
    t2s() { wget -q -U Mozilla -O $(tr ' ' _ <<< "$1"| cut -b 1-15).mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=$(tr ' ' + <<< "$1")"; }
    snipertyler · 2013-10-16 23:29:59 1
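
    A sketch of the bash-parameter-expansion variant alluded to above (same endpoint and 15-character filename limit; only tr is replaced):

    # sketch: parameter expansion builds both the filename and the '+'-encoded query
    t2s(){ local name="${1// /_}" query="${1// /+}"; wget -q -U Mozilla -O "${name:0:15}.mp3" "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=${query}"; }
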

  • 12
    say(){ mplayer -user-agent Mozilla "http://translate.google.com/translate_tts?tl=en&q=$(echo $* | sed 's#\ #\+#g')" > /dev/null 2>&1 ; }
    return13 · 2011-03-13 21:10:26 1
  • This microscript looks up a man page for each word when possible, and if the correct page is not found, uses w3m and Google's "I'm feeling lucky" to output a first possible result. This script was made as a result of an idea on a popular Linux forum, where users often send other people to RTFM by saying something like "man backup" or "man ubuntu one". To make this script replace the usual man command, save it as ".man.sh" in your home folder (a sketch of the full script follows this entry) and add the following line to the end of your .bashrc file: alias man='~/.man.sh'


    9
    /usr/bin/man $* || w3m -dump http://google.com/search?q="$*"\&btnI | less
    d1337r · 2010-10-05 13:51:39 0
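
    A sketch of the full wrapper script described above, saved as ~/.man.sh (the shebang and URL quoting are assumptions; the one-liner itself is unchanged):

    #!/bin/bash
    # try the real man page first; otherwise dump Google's "I'm feeling lucky" result via w3m
    /usr/bin/man $* || w3m -dump "http://google.com/search?q=$*&btnI" | less
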
  • Some commands have more information in 'info' than in the man pages.


    9
    rtfm() { help $@ || info $@ || man $@ || $BROWSER "http://www.google.com/search?q=$@"; }
    seattlegaucho · 2011-01-05 21:26:51 1
  • I found this command on a different site and thought you guys might enjoy it. Just change "YOURSEARCH" to whatever you want to search for. Example: "Linux Commands"


    9
    Q="YOURSEARCH"; GOOG_URL="http://www.google.com/search?q="; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q/\ /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"
    techie · 2013-04-03 09:56:41 5
  • Allows multiword translations; a usage sketch follows this entry.


    8
    translate() { lng1="$1";lng2="$2";shift;shift; wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=${@// /+}&langpair=$lng1|$lng2" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
    bandie91 · 2010-05-02 22:15:30 1
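
    A usage sketch (the example phrases are illustrative): language codes come first, the remaining arguments form the phrase.

    translate en es good morning everyone
    translate de en guten morgen
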
  • This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with its full path in case you have another one installed). It still requires that you install the Perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489 (a CPAN one-liner also follows this entry).


    7
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
    gthb · 2010-01-30 13:08:03 1
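
    One common way to install the module from CPAN (a sketch; the perlmonks link above covers alternatives):

    perl -MCPAN -e 'install HTML::Entities'
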
  • I took matthewbauer's cool one-liner and rewrote it as a shell function that returns all the suggestions, or outputs "OK" if it doesn't find anything wrong. It should work on ksh, zsh, and bash. Users who don't have tee can leave that part off like this: spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://google.com/tbproxy/spell|sed -n '/s="[1-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}';}


    5
    spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://www.google.com/tbproxy/spell|sed -n '/s="[0-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}'|tee >(grep -Eq '.*'||echo -e "OK");}
    eightmillion · 2010-02-17 08:20:48 4
  • Use curl and sed to shorten a URL using goo.gl without any other API.


    5
    curl -s -d'&url=URL' http://goo.gl/api/url | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'
    Soubsoub · 2010-10-01 23:20:08 2
  • Substitute "translation example" with the desired string; tl = target language (en, fr, de, hu, ...); you can leave the sl parameter as-is (autodetection works fine). A function wrapper sketch follows this entry.


    5
    wget -U "Mozilla/5.0" -qO - "http://translate.google.com/translate_a/t?client=t&text=translation+example&sl=auto&tl=fr" | sed 's/\[\[\[\"//' | cut -d \" -f 1
    sairon · 2011-03-06 13:46:16 0
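
    A sketch wrapping the command above in a function so the text and target language become arguments (the function name and the English default are assumptions):

    gtrans(){ wget -U "Mozilla/5.0" -qO - "http://translate.google.com/translate_a/t?client=t&text=${1// /+}&sl=auto&tl=${2:-en}" | sed 's/\[\[\[\"//' | cut -d \" -f 1; }
    # usage: gtrans "translation example" fr
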
  • Shorter and made into a function.


    4
    googl () { curl -s -d "url=${1}" http://goo.gl/api/url | sed -n "s/.*:\"\([^\"]*\).*/\1\n/p" ;}
    dabom · 2010-10-03 02:52:44 0
  • Sometimes you don't have man pages, only '-h' or '--help'.


    4
    rtfm() { help $@ || $@ -h || $@ --help || man $@ || $BROWSER "http://www.google.com/search?q=$@"; }
    karol · 2011-01-05 17:36:26 0
  • Same as above, but piping straight to a player so you can put in whatever text line you like; works on my Ubuntu machine.


    4
    p=$(echo "hello world, how r u?"|sed 's/ /+/g');wget -U Mozilla -q -O - "$@" translate.google.com/translate_tts?tl=en\&q=$p|mpg123 -
    jhansen · 2011-09-19 23:06:15 0
  • Use curl and sed to shorten a URL via goo.gl (a function wrapper sketch follows this entry).


    3
    curl -s 'http://ggl-shortener.appspot.com/?url='"$1" | sed -e 's/{"short_url":"//' -e 's/"}/\n/g'
    mvrilo · 2010-03-26 22:31:06 4
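
    The "$1" suggests this belongs in a function or script; a wrapper sketch (the function name is illustrative):

    shorten(){ curl -s 'http://ggl-shortener.appspot.com/?url='"$1" | sed -e 's/{"short_url":"//' -e 's/"}/\n/g'; }
    # usage: shorten "http://example.com/some/long/path"
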
  • The FLAC audio must be encoded at a 16000 Hz sampling rate (SoX is your friend; a sketch follows this entry). Outputs a short JSON string; the actual speech is in hypotheses->utterance, and the accuracy is stored in hypotheses->confidence (ranging from 0 to 1). Google also accepts audio in a special Speex format (audio/x-speex-with-header-byte), which is much smaller in comparison with lossless FLAC, but I haven't been able to encode such a sample.


    3
    wget -q -U "Mozilla/5.0" --post-file speech.flac --header="Content-Type: audio/x-flac; rate=16000" -O - "http://www.google.com/speech-api/v1/recognize?lang=en-us&client=chromium"
    sairon · 2011-03-08 13:39:01 0
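
    A sketch of producing the required 16000 Hz FLAC with SoX (the input filename and the mono downmix are assumptions):

    # sketch: resample to 16 kHz mono FLAC before posting to the speech API
    sox input.wav -r 16000 -c 1 speech.flac
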
  • Usage: google "[search string]" Example: google "something im searching for" This will launch Firefox and execute a Google search in a new tab with the provided search string. If using Cygwin, you must set $ff to the path of your Firefox binary, or create an alias like the following: alias firefox='/cygdrive/c/Program Files (x86)/Mozilla Firefox/firefox.exe' Most Linux flavors with Firefox installed (and even OS X) can use just ff="firefox".


    3
    google() { gg="https://www.google.com/search?q="; ff="firefox"; if [[ $1 ]]; then "$ff" -new-tab "$gg"$(echo ${1//[^a-zA-Z0-9]/+}); else echo 'Usage: google "[search term]"'; fi }
    lowjax · 2013-08-01 22:21:53 5
  • This is a basis for other Google API commands; a sketch of using the returned token follows this entry.


    2
    curl -s https://www.google.com/accounts/ClientLogin -d Email=$email -d Passwd=$password -d service=lh2 | grep Auth | sed 's/Auth=\(.*\)/\1/'
    matthewbauer · 2010-02-04 03:34:54 2
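
    A sketch of using the returned token (the Picasa feed URL is illustrative; ClientLogin itself has since been deprecated by Google):

    # sketch: capture the Auth token, then send it in a GoogleLogin authorization header
    auth=$(curl -s https://www.google.com/accounts/ClientLogin -d Email=$email -d Passwd=$password -d service=lh2 | grep Auth | sed 's/Auth=\(.*\)/\1/')
    curl -s -H "Authorization: GoogleLogin auth=$auth" "http://picasaweb.google.com/data/feed/api/user/default"
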
  • translate <phrase> <source-language> <output-language>; works from the command line.


    2
    cmd=$( wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; ); echo "$cmd"
    dtolj · 2010-03-13 01:09:00 0