Commands tagged google (68)

  • translate <phrase> <source-language> <output-language> works from the command line; a function-wrapper sketch follows this entry.


    2
    cmd=$( wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; ); echo "$cmd"
    dtolj · 2010-03-13 01:09:00 50
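
    A sketch wrapping the one-liner above in the translate function the description refers to; the positional arguments follow the command exactly, but this ajax.googleapis.com endpoint has long since been retired, so treat it as an illustration of the pattern:

    translate() { wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
    translate "bonjour le monde" fr en
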
  • Just add this to your .bashrc file. Use quotes when the query contains multiple words; a usage sketch follows this entry.


    2
    findlocation() { place=`echo $1 | sed 's/ /%20/g'` ; curl -s "http://maps.google.com/maps/geo?output=json&oe=utf-8&q=$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
    shadyabhi · 2010-10-18 21:11:42 15
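
    A usage sketch for the findlocation function above once it is in your .bashrc; the address is only an example, and the maps.google.com/maps/geo endpoint has since been retired:

    findlocation "Statue of Liberty, New York"
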

  • 2
    say() { curl -sA Mozilla -d q=`python3 -c 'from urllib.parse import quote_plus; from sys import stdin; print(quote_plus(stdin.read()[:100]))' <<<"$@"` 'http://translate.google.com/translate_tts' | mpg123 -q -; }
    kev · 2011-11-26 09:18:16 20
  • (1) Requires python-googl (install with: pip install python-googl). (2) Get your API key from the Google API console at https://code.google.com/apis/console/. A wrapper sketch follows this entry.


    2
    python -c 'import googl; print googl.Googl("<your_google_api_key>").shorten("'$someurl'")[u"id"]'
    shr386 · 2012-05-31 17:14:17 3
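
    A convenience wrapper sketch around the one-liner above; the function name shorten and the GOOGLE_API_KEY variable are assumptions, and the goo.gl service has since been shut down, so this only illustrates the pattern:

    shorten() { python -c 'import googl; print googl.Googl("'"$GOOGLE_API_KEY"'").shorten("'"$1"'")[u"id"]'; }
    shorten "http://www.example.com/some/very/long/path"
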

  • 2
    say() { wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=$1"; open output.mp3 &>/dev/null || xdg-open output.mp3 &>/dev/null; }
    runvnc · 2014-04-17 07:35:49 8
  • Searches Google without requiring quotes, and searches all terms given on the command line, e.g.: google foo bar opens the corresponding search URL. You could also use awk (or sed) to replace all spaces with a +, which is how Google search encodes spaces, but that makes it more than one line; such a variant is sketched after this entry.


    2
    function google () { st="$@"; open "http://www.google.com/search?q=${st}"; }
    plasticphyte · 2014-05-07 03:14:05 23
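
    A variant sketch of the function above that replaces every space with a + (the encoding Google search uses) and falls back to xdg-open on Linux; the sed step and the fallback are additions, not part of the original:

    google() { local q=$(echo "$@" | sed 's/ /+/g'); open "http://www.google.com/search?q=${q}" 2>/dev/null || xdg-open "http://www.google.com/search?q=${q}"; }
    google foo bar
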
  • Improved Google text-to-speech function. Lets you specify the language and plays the sound in the terminal, automatically removing the downloaded file after it has been played. Usage: say LANGUAGE TEXT. Examples: say en "This is a test." and say pl "To jest test"


    2
    function say { wget -q -U Mozilla -O google-tts.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=$1&q=$2"; open google-tts.mp3 &>/dev/null || mplayer google-tts.mp3 &>/dev/null; rm google-tts.mp3; }
    Zath · 2014-08-01 23:43:16 16
  • Access a random news web page on the internet. The Links browser can of course be replaced by Firefox or any modern graphical web browser. A shorter variant using shuf is sketched after this entry.


    2
    links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
    pascalvaucheret · 2016-07-26 11:52:12 54
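
    A shorter sketch of the same idea, using shuf (GNU coreutils) to pick the random URL instead of the epoch-seconds modulo trick; shuf is an assumption and may be missing on minimal systems:

    links "$( lynx -dump -listonly 'http://news.google.com' | grep -Eo '(http|https)://[a-zA-Z0-9./?=_-]*' | grep -v 'google.com' | sort -u | shuf -n 1 )"
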

  • 2
    nslookup -q=TXT _netblocks.google.com | grep -Po '\b([0-1]?\d{1,2}|2[0-4]\d|25[0-5])(\.([0-1]?\d{1,2}|2[0-4]\d|25[0-5])){3}(/\d{1,2})\b'
    emphazer · 2018-10-05 12:50:48 374
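
    A hedged extension of the lookup above: Google also publishes ranges under _netblocks2.google.com (IPv6) and _netblocks3.google.com, so looping over the three zones collects both address families; the extra zone names come from Google's published SPF records and may change over time:

    for z in _netblocks.google.com _netblocks2.google.com _netblocks3.google.com; do nslookup -q=TXT "$z" | grep -Eo '(ip4|ip6):[^ "]+' | cut -d: -f2-; done
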

  • 1
    spellcheck(){ curl -sd "<spellrequest><text>$1</text></spellrequest>" https://www.google.com/tbproxy/spell | sed 's/.*<spellresult [^>]*>\(.*\)<\/spellresult>/\1/;s/<c \([^>]*\)>\([^<]*\)<\/c>/\1;\2\n/g' | grep 's="1"' | sed 's/^.*;\([^\t]*\).*$/\1/'; }
    matthewbauer · 2010-02-17 01:55:28 14
  • You'll need googlecl and python-gdata. First set up googlecl by running: google. Then give your PC access with: google contacts list name,email. Then run the command and save its output, or use the variant that dumps it to the cone-adress.txt file in your home directory, and import that file into cone; the full pipeline and the dump-to-file variant are reproduced after this entry. It filters out duplicate emails and contacts without an email address that show up as N/A (Picasa photo persons without email, for example).


    1
    google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' #see below for full command
    Raymii · 2010-07-12 16:50:44 148
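
    The full pipeline from the description above, as given by the author; the second form dumps the result to ~/cone-adress.txt for importing into cone:

    google contacts list name,name,email | perl -p -n -e 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' | grep -o -P '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort
    google contacts list name,name,email | perl -p -n -e 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' | grep -o -P '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort > ~/cone-adress.txt
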
  • Opens the Google "I'm Feeling Lucky" result in lynx, the command-line browser.


    1
    lucky(){ url=$(echo "http://www.google.com/search?hl=en&q=$@&btnI=I%27m+Feeling+Lucky&aq=f&oq=" | sed 's/ /+/g'); lynx $url; }; lucky "Emperor Norton"
    smop · 2010-08-13 00:23:25 6
  • Alternative to http://commandlinefu.com/commands/view/6831/find-co-ordinates-of-a-location using $* instead of $1, so there is no need to quote multi-word locations.


    1
    findlocation() { place=`echo $* | sed 's/ /%20/g'` ; curl -s "http://maps.google.com/maps/geo?output=json&oe=utf-8&q=$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
    nimmylebby · 2010-10-18 21:38:20 5
  • Simple edit to make it work on OS X. Just add this to your ~/.profile and run source ~/.profile; an install sketch follows this entry.


    1
    rtfm() { help $@ || man $@ || open "http://www.google.com/search?q=$@"; }
    vaporub · 2011-01-26 06:23:42 5
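
    A sketch of the install step from the description, appending the function to ~/.profile and reloading it; the single quotes keep the $@ expansions literal until the function actually runs:

    echo 'rtfm() { help $@ || man $@ || open "http://www.google.com/search?q=$@"; }' >> ~/.profile && source ~/.profile
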
  • The wget call does the actual Google Dictionary query and returns a JSON string wrapped in a callback tag; sed 's/dict_api\.callbacks.id100.//' removes the beginning of that tag and sed 's/,200,null)//' removes its end. There are also some special characters which could cause problems with some JSON parsers, so if you get errors this is probably the cause (sed is your friend). I also like to trim the "webDefinitions" part with sed 's/\,\"webDefinitions.*//' because it sometimes contains misleading information (but remember to append a "}" at the end, because otherwise the JSON string will be invalid). The output also contains links to mp3 files with the pronunciation. As of now this is only usable for English; if you choose another language you will only get webDefinitions (which are crap). A combined pipeline is sketched after this entry.


    1
    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 16
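
    Combining the steps from the description into one pipeline, including the optional webDefinitions trim and the closing brace it requires; this assumes the response arrives on a single line (drop the final s/$/}/ step if you keep webDefinitions), and the dictionary endpoint itself has long since been retired:

    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//;s/,200,null)//;s/,"webDefinitions.*//;s/$/}/'
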
  • Usage: say hello world how are you today


    1
    say() { local IFS=+;mplayer "http://translate.google.com/translate_tts?q=$*"; }
    RanyAlbeg · 2011-09-08 13:02:46 3
  • This command places symbolic links to the files listed in an m3u playlist into a specified folder; useful for uploading playlists to Google Music. prefix = the full path prefix for file entries in your .m3u file, if the paths are relative (for example, if your list.m3u contains "Music/folder/song.mp3", you might specify "/home/username" as your prefix); list.m3u = path to the playlist; target_folder = path to the folder in which to create the symlinks. A variant for less regular playlists is sketched after this entry.


    1
    (IFS=$'\n'; ln -sf $(awk '((NR % 2) != 0 && NR > 1) {print "prefix" $0}' list.m3u) target_folder)
    lxe · 2011-09-25 16:45:28 6
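
    A variant sketch for playlists where comment lines and file paths do not alternate strictly: filter out #-prefixed lines instead of relying on line parity; the /home/username/ prefix is just the example prefix from the description:

    (IFS=$'\n'; ln -sf $(grep -v '^#' list.m3u | sed 's|^|/home/username/|') target_folder)
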
  • Get the first 10 Google results for a query, showing only the URLs of the results. Use + to join multiple search terms, e.g.: commandlinefu+google.


    1
    gg(){ lynx -dump http://www.google.com/search?q=$@ | sed '/[0-9]*\..http:\/\/www.google.com\/search?q=related:/!d;s/...[0-9]*\..http:\/\/www.google.com\/search?q=related://;s/&hl=//';}
    chon8a · 2012-04-21 03:31:26 6

  • 1
    Q="Hello world"; GOOG_URL="http://www.google.com/search?q="; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q/\ /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"
    westeros91 · 2012-08-26 20:13:21 8
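
    A sketch wrapping the scrape above in a reusable function (the name google_urls is an assumption). Note that ${Q// /+} replaces every space where the original ${Q/\ /+} only replaces the first, and the final echo -e turns the %XX escapes into \xXX so they are decoded:

    google_urls() { local Q="$*" AGENT="Mozilla/4.0" GOOG_URL="http://www.google.com/search?q="; local stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}\"${Q// /+}\"" | grep -oP '\/url\?q=.+?&amp' | sed 's/\/url?q=//;s/&amp//'); echo -e "${stream//\%/\x}"; }
    google_urls Hello world
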
  • Same as the other rtfm functions, but using the more correct xdg-open instead of $BROWSER. I can't find a way to open info only if the term exists, so it stays out of my version.


    1
    rtfm() { help $@ || man $@ || xdg-open "http://www.google.com/search?q=$@"; }
    KlfJoat · 2014-04-25 04:17:03 96
  • translate <some phrase> [output-language] [source-language]. 1) <some phrase> should be in quotes; 2) [output-language] is optional (default: English); 3) [source-language] is optional (default: auto). Examples: translate "bonjour petit lapin", translate "bonjour petit lapin" en, and translate "bonjour petit lapin" en fr each print: hello little rabbit


    1
    translate() { wget -U "Mozilla/5.0" -qO - "https://translate.google.com/translate_a/single?client=t&sl=${3:-auto}&tl=${2:-en}&dt=t&q=$1" | cut -d'"' -f2; }
    klisanor · 2014-06-10 12:08:51 11
  • sort -R randomizes the list; head -n1 takes the first entry.


    1
    links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
    mogoh · 2016-07-26 12:54:53 15
  • A bit shorter; the parentheses in the regex are not needed, but are added for clarity.


    1
    nslookup -q=TXT _netblocks.google.com | grep -Eo 'ip4:([0-9\.\/]+)' | cut -d: -f2
    jseppe · 2018-10-05 18:19:15 377
  • Substitute URL with your private/public XML URL from the calendar sharing settings, substitute the dates (YYYY-mm-dd), and adjust the perl parsing part to your needs.


    0
    wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}';
    unixmonkey4704 · 2009-07-23 14:48:54 4
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. This version was designed not only to run fast, but to work using BusyBox. BusyBox is a collection of basic Unix tools that have been compiled into a single binary to save space on tiny installations of Unix. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped-down versions of wget, tr, and sed. It turns out that those tools suffice for many tasks. Known bugs: this script does not handle HTML entities at all; I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could do it (a hedged attempt is sketched after this entry). Also, this script can only define a single word, not phrases (well, you could if you typed in %20, but that'd be gross). Lastly, this script does not show the URL where definitions were found; given the randomness of the Net, that last bit of information is often key.


    0
    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47 9
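
    Addressing the known bug mentioned above, a hedged sketch that appends a small sed-based decoder for the most common HTML entities; it only covers &quot; &#39; &lt; &gt; and &amp; (decoded last to avoid double-decoding), and assumes BusyBox sed accepts a ;-separated script:

    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p' | sed "s/&quot;/\"/g;s/&#39;/'/g;s/&lt;/</g;s/&gt;/>/g;s/&amp;/\&/g"
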
