Google Maps helps people navigate from place to place.
For it to work effectively, it helps if it can track the location of the device someone is using to navigate.
It’s interesting how Google tracks your location history. I’ve noticed that after I take a photo near a business, Google will sometimes ask if I would like to upload that photo to that business’s listing. Sometimes the photos aren’t relevant to the business I’ve taken them near, such as a photo of an agave plant that I took near a Seaside Market in Cardiff-by-the-Sea, California.
Google seems to like the idea of saving location history for people who might search for different types of businesses, and a recent patent that I wrote about described how Google might start using distances from a location history as a ranking signal (as opposed to a static distance from a desktop computer). I wrote about that in Google to Use Distance from Mobile Location History for Ranking in Local Search.
A couple of days ago, Gianluca Fiorelli published at Moz a thoughtful look at the search industry over the past year, and the year to come, titled SEO and Digital Trends in 2017. It included a graphic listing things he considered important events in the industry, including Google patents granted in 2016. He listed patents that I had written about in that graphic, but hadn’t linked to them in the post, so I considered doing so, and mentioned in the comments that I likely would. I also wrote a number of posts on the Go Fish Digital Blog, and decided that I would link from here to some of the Google patents granted in 2016 that I wrote about there as well.
Here are the Google patents granted in 2016 that I thought were interesting enough to write about this year, and something about what those Google patents do:
Search results pages (SERPs) are no longer just lists of pages ordered by how they rank for a query term. A Google paper shows us a different way of thinking about them in our age of structured snippets and featured snippets mixed with URL search results, with a search results evaluation model. The paper is:
Search engine results have gone through some significant changes over the past couple of years. A paper from the CIKM ’16 conference, held October 24-28, 2016, recently published on the Research at Google pages, describes some of the user behavior that may take place around search results. The benefit that the paper brings us is that it describes:
In this paper we propose a model of user behavior on a SERP that jointly captures click behavior, user attention and satisfaction, the CAS model, and demonstrate that it gives more accurate predictions of user actions and self-reported satisfaction than existing models based on clicks alone.
Google Glass isn’t the only heads-up display that Google will likely use or demonstrate. Imagine that Google acquired technology that let you use your gaze as a mouse, and tracked your eye movements to see what you are looking at. Google has acquired such technology.
I looked up the granted patents and pending patent applications from Eyefluence. Some of these have the same name and are possibly continuation patents (with different claims). I’m seeing differences in the claims that are worth comparing to see how the technology behind them has been updated. With Google looking at virtual reality applications, and likely more augmented reality applications, it’s good to see them investing in other related technologies.
A couple of Augusts ago, I went to a Semantic Business and Technology conference where the head of Yahoo’s Knowledge Graph, Nicolas Torzec, discussed how that knowledge graph was updated when some earth-shaking event took place. He told us that they were manually editing information in it. Upon hearing that, I thought it seemed like an area that could benefit from machine learning, to automate the process of keeping it up to date.
Another place that would benefit from machine learning would be generating featured snippets that answer questions people might ask at Google, and it appears that they thought it might be useful there, too. A Wired Magazine article from Monday describes how the sentence compression algorithms behind these featured snippets might be used:
This summer, Google was granted a patent that describes how the search engine might rank events based upon data that might indicate the popularity of those events, without relying on things such as the number of links pointed to pages about those events. The patent involves ranking events that occur in physical locations.
Examples of the kinds of events talked about in this patent include such things as music concerts, art exhibits, and athletic contests, all happening for specified periods of time at specified physical locations, such as concert halls, galleries, stadiums, or museums.
Since many events in a geographic region can happen at the same time or at overlapping times, interested individuals may at times find it difficult to determine which events to attend. For example, individuals may be unaware that events of interest are scheduled to occur or may have difficulty identifying the most interesting events when multiple events are occurring.
This ranking events patent lays out a general process flow to describe how its method works. It starts with receiving data about a physical location and the events taking place there during a certain time period, then computes signal scores for those events based upon things such as mentions of an event, and finally combines those signal scores into a popularity score for each event.
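The process flow above can be sketched in a few lines of code. This is only an illustration of the general idea, not the patent’s actual method: the signal names, normalization, and weights here are my own assumptions, since the patent doesn’t spell out specific formulas.

```python
# Hypothetical sketch of the ranking-events flow described in the patent.
# Signal names, normalization caps, and weights are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Event:
    name: str
    location: str
    mentions: int        # e.g., web mentions of the event (assumed signal)
    ticket_queries: int  # e.g., searches for tickets (assumed signal)


def signal_scores(event: Event) -> dict:
    """Compute per-signal scores, capped at 1.0 (illustrative normalization)."""
    return {
        "mentions": min(event.mentions / 100.0, 1.0),
        "ticket_queries": min(event.ticket_queries / 50.0, 1.0),
    }


def popularity_score(event: Event) -> float:
    """Combine the signal scores into a single popularity score."""
    weights = {"mentions": 0.6, "ticket_queries": 0.4}  # assumed weights
    scores = signal_scores(event)
    return sum(weights[name] * score for name, score in scores.items())


# Events at physical locations during some time period, ranked by popularity.
events = [
    Event("Concert A", "Concert Hall", mentions=80, ticket_queries=40),
    Event("Art Exhibit B", "Gallery", mentions=20, ticket_queries=5),
]
ranked = sorted(events, key=popularity_score, reverse=True)
```

Note that nothing here depends on links pointing at pages about the events; the ranking comes entirely from the computed popularity signals, which is the point the patent makes.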