Last week, Google published a paper on a way of navigating on touch screens by tracing out alphanumeric characters. For instance, if you have a list of contacts on your screen, and want to move down to a name that begins with the letter L, you would trace an L on your screen. Looking for a song on your playlist, you might handwrite on your screen the first couple of letters from the song title.
The paper is Gesture Search: A Tool for Fast Mobile Data Access (pdf), and it tells us that Gesture Search is presently in use by hundreds of thousands of users, with a mean rating of 4.5 out of 5 for more than 5,000 ratings.
A Google patent application published October 14th, Glyph Entry on Computing Device, provides a number of details that the paper doesn’t, including a look at how this gesture navigation system might be used to navigate maps from Google Maps.
The Google Gesture Search page is presently part of Google Labs, and it can be used to search for contacts, bookmarks, applications, and music tracks.
How might it be used with Google Maps?
When you are viewing a map from Google Maps, in addition to downloading the map tiles, you may also be downloading metadata associated with the map, such as the names of streets, landmarks, towns, businesses, and other points of interest located within the boundaries of the map.
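The patent doesn't describe an actual data format, but the idea of per-tile metadata can be pictured as a simple record per point of interest. Everything below — the field names, the tile-coordinate scheme, the `names_starting_with` helper — is an illustrative assumption, not something from the patent or from Google Maps:

```python
# Hypothetical shape of per-tile map metadata (illustrative only;
# the patent does not specify a wire format or field names).
tile_metadata = {
    "tile_id": "15/9647/12320",  # assumed zoom/x/y tile coordinates
    "features": [
        {"name": "Main Street",  "type": "street",   "lat": 40.01, "lon": -75.00},
        {"name": "Maple Street", "type": "street",   "lat": 40.02, "lon": -75.01},
        {"name": "Metrodome",    "type": "landmark", "lat": 40.05, "lon": -75.03},
    ],
}

def names_starting_with(metadata, letter):
    """Return feature names that match a traced letter (case-insensitive)."""
    return [f["name"] for f in metadata["features"]
            if f["name"].upper().startswith(letter.upper())]

print(names_starting_with(tile_metadata, "M"))
# → ['Main Street', 'Maple Street', 'Metrodome']
```

With metadata like this already on the phone, matching a traced letter against nearby place names wouldn't need a round trip to the server.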
Imagine driving through a town, and showing a map of your location on your phone. Trying to find Main Street on the map, you trace the letter M on your screen.
The patent tells us that other places that begin with the letter M might appear on a list that you can select from, such as a number of McDowell’s restaurants (as signified by their Golden Arcs), or other places such as Maple Street, Marple Avenue, and the Metrodome.
The items that show up in the list might appear in an order based upon a combination of prominence and distance from the user.
Prominence, in this case, could mean that street names might be listed first when it appears that a user is moving at high speed, or driving. Business names might be listed first when it appears that the user is walking, and might be looking for a particular business or store. Landmarks that are closer might be shown before landmarks that are farther away.
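The prominence-and-distance ordering described above can be sketched as a weighted score. The weights, the speed threshold, and the sample places below are assumptions for illustration; the patent describes the behavior, not a formula:

```python
# Candidate places matching a traced letter (illustrative sample data).
candidates = [
    {"name": "Main Street",  "type": "street",   "distance_km": 0.3},
    {"name": "McDowell's",   "type": "business", "distance_km": 1.2},
    {"name": "Metrodome",    "type": "landmark", "distance_km": 4.0},
]

def rank(candidates, user_speed_kmh):
    """Order matches by a blend of prominence and distance.

    Heuristic per the patent's description: street names get priority
    when the user appears to be driving, business names when walking,
    and nearer places beat farther ones. The threshold and weights
    here are assumed values, not from the patent.
    """
    driving = user_speed_kmh > 20  # assumed walking/driving cutoff

    def score(place):
        if driving:
            prominence = 1.0 if place["type"] == "street" else 0.0
        else:
            prominence = 1.0 if place["type"] == "business" else 0.0
        # Nearer places score higher; the +1 avoids division by zero.
        nearness = 1.0 / (1.0 + place["distance_km"])
        return 0.6 * prominence + 0.4 * nearness

    return sorted(candidates, key=score, reverse=True)

print([p["name"] for p in rank(candidates, user_speed_kmh=50)])
# → ['Main Street', "McDowell's", 'Metrodome']
```

At walking speed the same call would put McDowell's first, since business prominence outweighs Main Street's nearness under these assumed weights.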
This could be a helpful and interesting addition to Google Maps that could make it even easier to use, especially on a phone.
Keep an eye out. Gesture Search on Google Maps may be coming to a phone near you soon.
19 thoughts on “Google Gesture Search for Android on Google Maps?”
This is insane, everything is developing so fast. I can't stand it, too much speed. By the way, this new feature is cool.
This is a potentially awesome feature – usability at its purest form. I wonder how severely, though, this additional data download will slow the whole process down. This is not trivial.
Sounds great. Can’t wait until this is included in a new version of Android. Gingerbread maybe?
Sorry, Google. I want the Johnny Mnemonic interface, where I open up a virtual 3-D information gateway that allows me to search for and interact with what I’m looking for in real-time.
This is a cool feature, but I'm wondering how useful it is on Google Maps or any GPS function. After all, it's only a few seconds of time you save, but as a gadget it's nice.
This new feature simply rocks, I love it!
The idea brings the technology closer to the simplicity of Hollywood sci-fi. Cool article.
I didn’t know this, thanks for the info. This is a really cool shortcut.
I expect things will even start moving faster in the future, especially when it comes to the mobile web.
It is cool.
I was wondering if Google could find a way for us to search that metadata from our desktops. I think it would be useful there as well.
I'd imagine that the metadata associated with an urban map tile might be considerably larger than one for a suburban or rural setting as well, and yet that might be the kind of area where it could be most useful.
That download size might be one of the reasons that we don't yet see integration between gesture search and Google Maps, though I would guess there are other potential technical hurdles in getting it to work well.
Some of the things I've seen in patents from Google have already been released and in use for a few years. Some are offered by Google within a few days of a pending patent application being published to the public, or of a patent being granted. Others weren't offered until months or years later, and I've seen still others that don't appear to have ever been offered.
I'll second the Johnny Mnemonic interface.
Wondering if Google ever tried to hire William Gibson…
This seems WAAY too confusing. Is it possible to remember all these gestures???
Fortunately, most of the gestures used in this process are ones that you’ve been learning since possibly before first grade – letters and numbers.
Looks like a cool new feature, and much needed; search on Google Maps with my Desire doesn't work so well. The biggest problem is searching while driving. I need some function so I can search with one hand or with voice control. Two-handed control while driving is kind of unsafe. 🙂
One handed search while driving really isn’t all that advisable either. Better to let a passenger search, or to pull over. 🙂
Google Maps navigator (http://www.google.com/mobile/navigation/) does let you search by voice, too.
This is great – Google (and app developers) should roll this out across all applications. It should appear in place of a keyboard if a user selects gesture input.
Really awesome feature, and its use with Google Maps is even cooler. Do you have any other features like this?
It is an awesome feature, though I’m hoping that touchscreens don’t start invading the desktop and replacing mice.
I think there are some other unique features in the works from Google for mobile devices in their patents, but I haven’t been following those as closely as the search related ones.