Google Maps helps people navigate from place to place, and it also tracks how busy places are.
For it to work effectively, it helps if Google can track the location history of the device someone is using to navigate.
It’s interesting how Google tracks your location history. I’ve noticed that after I take a photo near a business, Google will sometimes ask if I would like to upload that photo to that business’s listing. Sometimes the photos aren’t relevant to the business I’ve taken them near, such as a photo of an agave plant that I took near the Seaside Market in Cardiff-by-the-Sea, California (it has nothing at all to do with the Seaside Market).
Google seems to like the idea of saving location history for people who might search for different types of businesses, and a recent patent I wrote about described how Google might start using distances from a location history as a ranking signal (as opposed to a static distance from a desktop computer). I wrote about that in Google to Use Distance from Mobile Location History for Ranking in Local Search.
A couple of days ago, Gianluca Fiorelli published a thoughtful look at the search industry over the past year, and the year to come, in a Moz post titled SEO and Digital Trends in 2017. He included a graphic listing things that he considered important events in the industry, including Google patents granted in 2016. He listed patents that I had written about in that graphic, but hadn’t linked to them in the post, so I considered doing so, and mentioned in the comments that I likely would. I also wrote a number of posts on the Go Fish Digital blog, and decided that I would link from there as well to some of the Google patents I wrote about that were granted in 2016.
Here are the Google patents granted in 2016 that I thought were interesting enough to write about this year, along with a bit about what each of them does:
Search results (SERPs) are no longer just pages ordered by how they rank for a query term. A Google paper shows us a different way of thinking about them in our age of structured snippets and featured snippets mixed with URL search results, with a search results evaluation model. The paper is:
Search engine results have gone through some significant changes over the past couple of years. A paper from the CIKM ’16 conference (October 24-28, 2016), recently published on the Research at Google pages, describes some of the user behavior that may take place around search results. The benefit the paper brings us is that it describes:
In this paper we propose a model of user behavior on a SERP that jointly captures click behavior, user attention and satisfaction, the CAS model, and demonstrate that it gives more accurate predictions of user actions and self-reported satisfaction than existing models based on clicks alone.
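The key idea in that quoted passage is that clicks alone can mislead: a user who reads an answer directly on the SERP without clicking may still be satisfied. The toy sketch below is purely illustrative and is not the CAS model itself; the feature names, weights, and values are invented to show how a joint click-plus-attention score differs from a clicks-only score.

```python
# Illustrative sketch only: contrasting a clicks-only satisfaction
# predictor with a toy score that also uses attention signals, in the
# spirit of the CAS model. All weights and inputs here are invented.

def clicks_only_score(clicked: bool) -> float:
    """A clicks-only model: satisfaction is assumed iff the user clicked."""
    return 1.0 if clicked else 0.0

def cas_style_score(clicked: bool, dwell_seconds: float, gaze_share: float) -> float:
    """A toy joint score combining a click signal, dwell time, and
    gaze share (a stand-in attention signal). Weights are arbitrary."""
    click_term = 0.4 if clicked else 0.0
    dwell_term = 0.3 * min(dwell_seconds / 30.0, 1.0)  # cap dwell credit at 30s
    attention_term = 0.3 * gaze_share                  # fraction of gaze on the result
    return click_term + dwell_term + attention_term

# A "good abandonment": the user read an answer on the SERP without clicking.
print(clicks_only_score(clicked=False))                                      # 0.0
print(round(cas_style_score(False, dwell_seconds=0.0, gaze_share=0.9), 2))   # 0.27
```

The clicks-only model scores this session as total failure, while the joint score credits the attention the result actually received, which is the kind of gap the paper's model is built to close.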
Google Glass isn’t the only heads-up display that Google will likely use or demonstrate. Imagine that Google acquired eye-tracking technology that lets you use your gaze as a mouse and tracks your eye movements to see what you are looking at. Google has acquired such technology.
I looked up the granted patents and pending patent applications from Eyefluence. Some of these share the same name and are possibly continuation patents (with different claims). I’m seeing differences in the claims that are worth comparing to see how the technology behind them has been updated. With Google looking at virtual reality applications, and likely more augmented reality applications, it’s good to see them investing in related technologies such as eye tracking.
How is a knowledge graph updated when some earth-shaking event takes place? Is a search engine manually editing information in that knowledge graph? It seems like an area that could use a machine learning element to automate it and keep it up to date.
Another place that would benefit from machine learning is generating featured snippets that answer questions people might ask at Google, and it appears that Google thought it might be useful there, too. A Wired Magazine article from Monday describes how a sentence compression algorithm behind these featured snippets might be used:
At the heart of this approach is the crawling of a data store of news articles and other sources, with the help of a “massive team of PhD linguists it calls Pygmalion,” and the use of algorithms referred to as “sentence compression” algorithms, which might generate answers to questions from those news sources for featured snippets.
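To give a feel for what “sentence compression” means, here is a deliberately naive sketch. Google’s actual system is a learned model trained with the help of those linguists; this toy only conveys the core idea of deleting optional material from a source sentence to leave a concise answer. The deletion rules and word list are invented for illustration.

```python
# Illustrative sketch only: a naive rule-based "sentence compression"
# that deletes parenthetical asides and a small list of optional
# intensifiers. Real sentence compression systems learn which words
# or phrases can be safely dropped; this toy just shows the idea.
import re

OPTIONAL_WORDS = {"very", "really", "quite", "extremely", "basically"}

def compress(sentence: str) -> str:
    # Drop parenthetical asides entirely.
    sentence = re.sub(r"\s*\([^)]*\)", "", sentence)
    # Drop common optional intensifiers.
    words = [w for w in sentence.split()
             if w.lower().strip(",.") not in OPTIONAL_WORDS]
    return " ".join(words)

print(compress("The agave (a succulent native to hot regions) is really quite hardy."))
# → "The agave is hardy."
```

A learned model would decide which spans to delete based on training data rather than fixed rules, but the output shape is the same: a shorter sentence that still answers the question.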