More Google Glasses Patents: Beyond the Design


Google’s Project Glass seems to be moving closer and closer to reality, with the granting of 7 more patents today. Last week, I pointed out 4 patents related to the project in Google Glasses Design Patents and Other Wearables. Of those, 3 were design patents filed to protect the look and feel of the glasses, and the fourth patent described a way of using an infrared (IR) reflective surface on rings or gloves or even fingernails to provide input for the eyeglass display device. Today’s grants include only 1 design patent, along with 6 utility patents that describe some of the more technical details about how Google’s heads-up display might work.

The first patent is a design patent from inventors who worked on the three design patents granted last week, Matthew Wyatt Martin and Maj Isabelle Olsson (Mitchell Joseph Heinrich was a co-inventor of one of those earlier three).

Another potential version of how Google Glasses might look.

Wearable display device section
Invented by Maj Isabelle Olsson and Matthew Wyatt Martin
Assigned to Google
Granted May 22, 2012
Filed: October 26, 2011
CLAIM: The ornamental design for a wearable display device section, as shown and described.

One concern I’ve seen raised in comments about Google Glasses is that people might walk into the side of a building or in front of a bus while watching a virtual world through the glasses. The first of the utility patents describes how a pair of glasses like these might instead act to augment and improve our perception of the real world around us by telling us about sounds, the direction they are coming from, and the intensity of those sounds. We’re told that this application might also help people who are hearing impaired or have difficulty hearing.

Someone attempting to cross a street might not be aware of an approaching car honking to warn that pedestrian. The glasses could indicate the direction the honks are coming from, and how intense they might be to “indicate how close the oncoming car is to the user.”

Displaying sound indications on a wearable computing system
Invented by Adrian Wong and Xiaoyu Miao
Assigned to Google
US Patent 8,183,997
Granted May 22, 2012
Filed: November 14, 2011


Example methods and systems for displaying one or more indications that indicate (i) the direction of a source of sound and (ii) the intensity level of the sound are disclosed. A method may involve receiving audio data corresponding to sound detected by a wearable computing system.

Further, the method may involve analyzing the audio data to determine both (i) a direction from the wearable computing system of a source of the sound and (ii) an intensity level of the sound. Still further, the method may involve causing the wearable computing system to display one or more indications that indicate (i) the direction of the source of the sound and (ii) the intensity level of the sound.
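To make the idea concrete, here’s a minimal Python sketch of how a two-microphone wearable might estimate the direction and intensity level the abstract talks about. The microphone spacing, the arrow symbols, and the thresholds are my own illustrative assumptions, not details from the patent.

```python
import math

def sound_indication(delay_s, rms_amplitude, mic_spacing_m=0.14, speed_of_sound=343.0):
    """Estimate a sound's bearing from the arrival-time difference between
    two microphones, and express its intensity level in decibels.

    delay_s: time difference of arrival (left mic minus right mic), seconds.
    rms_amplitude: RMS amplitude of the detected sound (arbitrary units).
    """
    # Classic two-microphone bearing estimate: sin(angle) = c * delay / spacing.
    ratio = max(-1.0, min(1.0, speed_of_sound * delay_s / mic_spacing_m))
    bearing_deg = math.degrees(math.asin(ratio))  # 0 degrees = straight ahead
    # Intensity level relative to a reference amplitude of 1.0.
    level_db = 20.0 * math.log10(max(rms_amplitude, 1e-9))
    return bearing_deg, level_db

def render_indicator(bearing_deg, level_db):
    """Pick a simple on-screen arrow and size for the HUD overlay."""
    arrow = "→" if bearing_deg > 10 else "←" if bearing_deg < -10 else "↑"
    size = "large" if level_db > 20 else "small"
    return f"{arrow} ({size})"
```

A louder honk from the right would produce a larger right-pointing arrow, which is roughly the “how close is the oncoming car” cue the patent describes.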

There are a number of different types of sensors that might be used with a pair of glasses, and those might be part of a nose bridge, primarily because it’s a part of the glasses that faces forward (as opposed to the sidearms). Some of the different types of sensors could include a video camera, a sonar sensor, an ultrasonic sensor, and possibly a microphone that might monitor patterns associated with breathing.

A sensor between two sides of a nose bridge might also recognize the appearance of a nose between them, and turn on the glasses when that happens.

The patent mentions the possibility of finger-operable touch pad input devices on the sidearms of the glasses. As for the display, it could be:

…a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user’s eyes, or other optical elements capable of delivering an in focus near-to-eye image to the user. A corresponding display driver may be disposed within the frame elements 104 and 106 for driving such a matrix display. Alternatively or additionally, a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user’s eyes.

Nose bridge sensor
Invented by Max Braun, Ryan Geiss, Harvey Ho, Thad Eugene Starner, Gabriel Taubman
Assigned to Google
US Patent 8,184,067
Granted May 22, 2012
Filed: July 20, 2011


Systems and methods for selecting an action associated with a power state transition of a head-mounted display (HMD) in the form of eyeglasses are disclosed. A signal may be received from a sensor on a nose bridge of the eyeglasses indicating if the HMD is in use. Based on the received signal, a first power state for the HMD may be determined. Responsive to the determined first power state, an action associated with a power state transition of the HMD from an existing power state to the first power state may be selected.

The action may be selected from among a plurality of actions associated with a plurality of state transitions. Also, the action may be a sequence of functions performed by the HMD including modifying an operating state of a primary processing component of the HMD and a detector of the HMD configured to image an environment.
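That abstract reads like a small state machine, and can be sketched in a few lines of Python. The state names, the specific actions, and the two-state simplification are all my own placeholders; the patent contemplates a plurality of states and transitions.

```python
# Hypothetical power states and transition actions; names are illustrative,
# not taken from the patent text.
ON, STANDBY = "on", "standby"

TRANSITION_ACTIONS = {
    (STANDBY, ON): ["wake_display", "start_environment_camera"],
    (ON, STANDBY): ["dim_display", "stop_environment_camera"],
}

class NoseBridgeHMD:
    def __init__(self):
        self.state = STANDBY
        self.log = []

    def sensor_signal(self, nose_detected: bool):
        """Map the nose-bridge sensor reading to a target power state and
        run the actions for that transition, as the abstract describes."""
        target = ON if nose_detected else STANDBY
        if target != self.state:
            for action in TRANSITION_ACTIONS[(self.state, target)]:
                self.log.append(action)
            self.state = target
```

Putting the glasses on (nose detected) wakes the display and the environment-imaging detector; taking them off powers those components back down.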

The next patent describes some of the display aspects of these devices, as they might be implemented in “helmet, contact lens, goggles, and glasses.” It tells us about things such as how foreground and background images might be displayed, with the foreground images in sharper focus than the background images.

Processing objects for separate eye displays
Invented by Charles C. Rhodes and Babak Amirparviz
Assigned to Google
US Patent 8,184,068
Granted May 22, 2012
Filed: August 23, 2011


Disclosed are embodiments for methods and devices for displaying images. In some example embodiments, methods may include receiving data corresponding to an image with a processor. The image data may include at least one image object. In additional example embodiments, each image object may be assigned to either a foreground image set or a background image set using a processor, for example. An example embodiment may also include rendering a first display image based on at least the foreground image set. The first display image may include the objects assigned to the foreground image set.

Additionally, the objects assigned to the foreground image set may be in focus in the first display image. Embodiments may also include rendering a second display image based on at least the background image set. The second display image may include the objects assigned to the background image set. Additionally, the objects assigned to the background image set may be in focus in the second display image.
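In code, the assignment-and-render step from the abstract might look something like the following sketch. The object representation and the “sharp” focus flag are assumptions I’ve made for illustration; the patent itself deals with rendering two display images, one per eye display.

```python
def split_and_render(objects, foreground_ids):
    """Assign each image object to a foreground or background set, then
    render two display images, each with its own set in focus.

    objects: list of dicts, each with at least an "id" key.
    foreground_ids: set of ids that belong in the foreground set.
    """
    foreground_set = [o for o in objects if o["id"] in foreground_ids]
    background_set = [o for o in objects if o["id"] not in foreground_ids]
    # First display image: foreground objects, rendered in focus.
    first_image = [dict(o, focus="sharp") for o in foreground_set]
    # Second display image: background objects, rendered in focus.
    second_image = [dict(o, focus="sharp") for o in background_set]
    return first_image, second_image
```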

When we’re looking forward, we have a wide field of vision in front of us, but our actual gaze might be directed at something specific within that field. The glasses might track where we are actually looking to prioritize what is within that visual area. Head movements, such as turning our heads to the right or left, might also inform the glasses of a change of visual attention, and samples of ambient audio might also trigger an application based upon the direction of attention.

We’re told that this might be used with a number of different applications, but not given any examples of what those might be.

Systems and methods for adaptive transmission of data
Invented by Charles C. Rhodes
Assigned to Google
US Patent 8,184,069
Granted May 22, 2012
Filed: June 20, 2011


The present disclosure describes systems and methods for transmitting, receiving, and displaying data. The systems and methods may be directed to providing a constant or substantially constant data transmission rate (e.g., frame rate per second) to a device and controlling bandwidth by presenting information directed to an area of interest to a user.

Bandwidth can be lowered, for example by presenting high resolution data directed to the area of interest to the user (e.g., an area to which the user is looking or “gazing” using a heads-up display), and lower resolution data directed to other areas. Data can be transmitted and received at a constant frame rate or substantially constant frame rate, and gaze direction and progressive compression/decompression techniques can be used to transmit data focused on areas directed to an area of interest to the user.
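A toy version of that bandwidth trick: divide the frame into tiles and send full resolution only at and around the gaze point. The grid size, resolution scales, and distance rule below are illustrative assumptions, not values from the patent.

```python
def tile_resolutions(grid_w, grid_h, gaze_tile, high=1.0, low=0.25):
    """Assign a per-tile resolution scale: full resolution at and around
    the tile the user is gazing at, reduced resolution everywhere else,
    keeping the transmitted frame size roughly constant."""
    gx, gy = gaze_tile
    scales = {}
    for x in range(grid_w):
        for y in range(grid_h):
            # Chebyshev distance in tiles from the gaze point.
            dist = max(abs(x - gx), abs(y - gy))
            scales[(x, y)] = high if dist <= 1 else low
    return scales
```

Because far-from-gaze tiles carry a quarter of the pixels, the device can hold a constant frame rate while spending its bandwidth on the area of interest.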

An accelerometer system, including a gyroscope and/or a compass built into the glasses might be able to understand the kinds of activities that a wearer is engaged in, such as “sitting, walking, running, traveling upstairs, and traveling downstairs.” The user interface shown to someone might vary based upon their activity.

Someone standing still might be more likely to find a data-intensive display useful than someone walking. Someone running might want an interface that includes some statistics about their run.

Method and system for selecting a user interface for a wearable computing device
Invented by Gabriel Taubman
Assigned to Google
US Patent 8,184,070
Granted May 22, 2012
Filed: July 6, 2011


Example methods and systems for selecting a user interface for a wearable computing device are disclosed. An accelerometer system may determine a user activity of a user wearing a wearable computing device. Based on the user activity determined by the accelerometer system, the wearable computing device may select a user interface for the wearable computing device such that the user interface is appropriate for the determined user activity.
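The activity-to-interface mapping could be as simple as the sketch below. The accelerometer thresholds, activity names, and UI labels are all placeholders I’ve invented to illustrate the flow; the patent doesn’t specify them.

```python
def classify_activity(accel_variance, vertical_periodicity_hz):
    """Very rough activity classifier from accelerometer statistics.
    Thresholds are illustrative placeholders, not from the patent."""
    if accel_variance < 0.05:
        return "sitting"
    if vertical_periodicity_hz > 2.5:
        return "running"
    return "walking"

# One user interface per detected activity, per the patent's idea that the
# UI should be appropriate for what the wearer is doing.
UI_FOR_ACTIVITY = {
    "sitting": "detailed",   # data-intensive layout for a stationary user
    "walking": "minimal",    # unobtrusive layout
    "running": "run_stats",  # pace, distance, and similar statistics
}

def select_ui(accel_variance, vertical_periodicity_hz):
    return UI_FOR_ACTIVITY[classify_activity(accel_variance, vertical_periodicity_hz)]
```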

This last patent focuses primarily upon data transmission and reception of a wearable computing device, as well as how different body movements, audio commands, viewed gestures, and even touch commands might be used to send and receive data.

Wireless directional identification and subsequent communication between wearable electronic devices
Invented by Harvey Ho, Babak Amirparviz, Luis Ricardo Prada Gomez, and Thad Eugene Starner
Assigned to Google
US Patent 8,184,983
Granted May 22, 2012
Filed: June 9, 2011


Disclosed are methods, devices, and systems for exchanging information between a first wearable electronic device and one of a second wearable electronic device and an account at a remote computing device associated with a user of the second wearable electronic device. The first wearable electronic device intermittently emits directed electromagnetic radiation comprising a beacon signal, and receives, via a receiver coupled to the first wearable electronic device, a signal from the second wearable electronic device identifying one of the second wearable electronic device and the account at the remote computing device.

An input may then be detected at the first wearable electronic device, and in response to receiving the signal and detecting the input, the first wearable device may transmit additional data to one of the second wearable electronic device and the remote computing device associated with the second user.
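The exchange the abstract describes — beacon, identifying reply, then a user input gating the actual transfer — can be simulated in a few lines. The device classes, the “contact_card” payload, and the tap-style input are my own hypothetical stand-ins for the patent’s more general language.

```python
class WearableDevice:
    """Toy stand-in for a wearable electronic device in the exchange."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.inbox = []

def exchange(first_device, second_device, user_input_detected):
    """Simulate the described flow: the first device emits a directed
    beacon, the second replies with an identifying signal, and a detected
    input on the first device triggers the additional data transfer."""
    beacon = {"from": first_device.device_id, "type": "beacon"}
    reply = {"from": second_device.device_id, "type": "identify"}
    if reply["type"] == "identify" and user_input_detected:
        second_device.inbox.append(
            {"from": first_device.device_id, "payload": "contact_card"}
        )
        return True
    return False
```

The same reply could instead route the payload to an account at a remote computing device associated with the second wearer, per the abstract.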


These patents don’t contain all the answers about how Google’s Project Glass might work, but they provide a number of interesting possibilities and alternatives. I’m somewhat surprised at the attention to detail, like tracking the actual gaze of someone looking through the lenses of these devices instead of just tracking where they are pointed, or enhancing someone’s ability to understand where sounds around them are coming from, as well as the intensity of those sounds.


23 thoughts on “More Google Glasses Patents: Beyond the Design”

  1. These glasses are going to be one of those ground breaking types of technology. Reading the excerpt about the potential hazards of wearing the glasses while trying to function normally, and the risks of running into things is a valid point. The disclaimer will probably be extensive, and rightly so. Can’t wait to try these out though…great post, thanks for sharing the informative break down!

  2. Hi Molly,

    I think it will be groundbreaking, too. It does look like Google’s trying to think of ways to avoid tragedy when people are viewing the world through their augmented reality glasses.

    For instance, one of the newly granted patents mentions the possibility of a sonar sensor, and another patent describes the ability to see visualizations of sounds, the direction they are coming from, and how loud or intense they might be. A third patent tells us that Google might use different user interfaces based upon the kinds of activities that someone is engaged in, as detected by an accelerometer, so that a more detailed view might be shown when someone is stationary, a less obtrusive view when they are walking, and an even less distracting interface when running.

    It’s also possible that some of the sensor technology that Google appears to have acquired for self-driving cars (or at least much safer cars) could also find its way into this kind of technology. If Google has dreams of making cars that drive themselves safely, I would hope that they are able to come up with ways to keep people from walking into dangerous situations while wearing augmented reality glasses.

  3. I’m just waiting for the Google glasses app that allows you to see what others are seeing….hehehe. Like Skype for glasses. While you’re out at the store you can have your wife check to make sure you’re buying the right product just by looking in the corner of your vision to see what your husband is doing.

  4. Imagine the possibilities… when cars first came out, people said much of the same things. People will run into things and run over other people. There are going to be accidents and people will be killed, because they go too fast. Every possible reason for NOT having cars was mentioned. They were correct, all those things did happen to some extent, but not to people that paid attention. People were able to drive in lanes on the same road… even before lanes were there.

    I think people are thinking of their computers and all the data you see there, or a TV screen that shows you everything… I don’t think they are imagining at all what Google is talking about…

    So imagine having the sonar and audio and knowing in advance that a speeding car was coming down the street or it was sliding around, and you get a visual sensor while listening to music on your headphones that shows the danger, long before you even see it. It could give you distance, rate of speed, and show every person around you. Sonar could show people and buildings allowing blind people more freedom or deaf people a chance to see the world in a way never before thought possible.

    But those of us with all our senses intact might have little signals of things we should know, to help us be more aware… answers to questions at our fingertips. See a sign, and maybe enhance it visually, so we can read it better or have it read to us, using voice. Or imagine looking outside and seeing a simple weather report, like in the video they created. I can see how people could actually be more productive and much safer than they are now, using this technology. People already wander around half blind and oblivious to the world around them… staring at their phones, texting each other, listening to music, etc… why not integrate that, so you can pay more attention, and have it be non-obtrusive…

    What an amazing world it might become… I only wish it was 20 years ago. 0.o

  5. Hi Justin,

    Me too! These glasses appear to actually “augment” reality instead of providing an alternative reality. The sonar sensors, the visual cues to sounds, and other features do seem to provide a way of understanding better the things that are actually going on in the world around us instead of distracting us from it. Definitely some interesting ideas floating around these glasses.

  6. Wow – augmented reality glasses – very futuristic but an awesome invention, can’t wait to test them. I agree with Justin all this is pretty hard to imagine but truly wonderful indeed.

  7. @SuperNerd:

    Ha, that’s a great comment, and I think something like that is a definite possibility, one day.

    I’m really looking forward to when these are released, but I don’t think we’ll see anything like this that actually works for another 10 years or so, unfortunately.

  8. I think the Google Glasses are going to be the start of something amazing. I think they might be a little slow to start off with but as technology increases and becomes more efficient, who knows, everyone might have a pair sooner or later.

  9. Hi Marie,

    I had some doubts, too. But the patents that I’m seeing rolling out makes me believe that Google is pretty serious about these glasses.

  10. Hi Charles,

    Those “future is now” kinds of inventions seem to be popping up increasingly. Even things like smart phones that let you access more information than is held in the Library of Congress are amazing. Who would have thought that 20 years ago?

  11. Hi Louis,

    I’m finding myself surprised by some of the things that are showing up in the patents for Project Glass. If the initial glasses offered have a fraction of what I’m seeing, they would be pretty amazing.

  12. I love this sort of stuff. It reads like something from a science fiction film, but it is all happening right now. However, the fact that I find this exciting and slightly Sci-fi makes me think that I am maybe only a few years away from relying on my children to program the DVD recorder or whatever…….the world is moving too fast and I am getting too old, too quickly!

  13. Hi Gary,

    I think Google is enjoying the fact that they are bringing some science fictional type ideas out into the world in tangible products like augmented reality glasses and self driving cars. Hopefully our DVD players/recorders will be capable of programming themselves before too long. 🙂

  14. Hi Wyatt.

    So far, I really like what I’m hearing about the glasses. I remember buying a sony walkman not long after it came out and being underwhelmed. I don’t expect that experience with these glasses.

  15. It can be a bit tough for Google to get more people using this Google Goggles technology, but in 5 years, we can surely expect a good percentage of people to use it, at least the geeks and the gadget freaks.

  16. As cool as these would be for now I think they’re going to fail somewhat. But we need them developed to bring some great stuff in a few years time. Just as the iPhone and iPad started a revolution so will the glasses…just not yet

  17. Hi Bill,

    It seems that my last comment was a bit naive… It’s looking like they may be with us as early as Q4 this year:
