Google’s patents have provided a great number of hints over the past 10 years about local search and how Google treats businesses and landmarks in Maps and Web results and elsewhere. I’ve been fortunate enough to have uncovered some of these patents and written about many of the algorithms and approaches that Google has used, including concepts like location prominence, location sensitivity, Maps in Universal Search, Google’s Crowdsensus Algorithm, and more.
I am going to be the keynote speaker at Local U Advanced in Baltimore, running from Friday night, March 8, at 7:00 pm through Saturday, March 9, at 5:00 pm (there's an early bird discount of $100 if you sign up before February 8th). This Local University presentation will take place in Hunt Valley, MD. There's an amazing group of speakers lined up for the event, covering the local, mobile, and social aspects of local search.
Google has been quite active in filing patents and publishing papers about the intersection of mobile devices, local search, social signals, and the evolution of a search powered in part by a knowledge base. Predictive algorithms behind features like Google Now and parameterless searches, indoor mapping, the wearable computing of Google's Project Glass, and local/mobile/social apps will power the future of Google Maps, and of Google's efforts to tie together information from many sources as it maps and shares the world around us.
The different algorithms and methods that Google uses to map the Web are echoed by the methods it uses to map the world, though the physical world brings challenges of its own. On the plus side, land doesn't move (thankfully, at least not that frequently). On the minus side, instead of robots.txt files telling search engine crawling programs where they can and can't crawl, Street View cars have to watch for signs saying things like "Military Base" or "Private Road," and water features can bring a crawl to a standstill.
Speaking of local search-related patents, sometimes Google introduces something completely new when a patent application is published, or a patent is granted. That’s been happening a lot lately with the augmented reality heads-up displays that Google has been working on with Project Glass, which will be the focus of Project Glass hackathons at the end of January and beginning of February.
Sometimes Google is granted a patent that gives us a look back at something we've been seeing for years. It can fill in some gaps for us, give us new terms for behaviors we already recognize, and confirm some of the things we've been observing. I like seeing those patents because they provide some of the backstory behind things we've seen before. For example:
We’ve all probably seen a set of search results from Google with maps results included within them on a search that doesn’t include any geographic information within the query itself. This is the first time I can recall seeing that in a screenshot from a Google patent, though.
Put as concisely as possible, the patent describes how Google might perform a search based on a query, checking the query against a whitelist of terms as it does, to tell whether it should perform a second, more geographically aware search, one that understands both the locations of businesses related to the query and the location of the person performing the search.
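To make that flow concrete, here's a minimal sketch of the two-pass approach the patent describes. The whitelist contents and the search functions are stand-ins I've invented for illustration; nothing here comes from Google's actual implementation:

```python
# A toy whitelist of terms treated as signaling local intent (invented).
LOCAL_INTENT_WHITELIST = {"pizza", "plumber", "dentist", "electrician"}

def web_search(query):
    # Stand-in for an ordinary Internet document search.
    return ["web result for " + repr(query)]

def local_search(query, near):
    # Stand-in for a search over business listings near the searcher.
    return ["local result for " + repr(query) + " near " + str(near)]

def search(query, user_location=None):
    """Run a web search; if any query term is on the whitelist, also run
    a second, geographically aware search and present both together."""
    results = web_search(query)
    if any(term in LOCAL_INTENT_WHITELIST for term in query.lower().split()):
        # The second search uses a location associated with the client or
        # the user, since the query itself names no place.
        results = local_search(query, near=user_location) + results
    return results

print(search("pizza", user_location="Baltimore, MD"))
```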
The patent granted today is:
System and method for displaying both localized search results and internet search results
Invented by David D. Shin
Assigned to Google
US Patent 8,359,300
Granted January 22, 2013
Filed: April 3, 2007
Abstract
A method of presenting search results includes sending to a server a search query, wherein the search query does not include any term that identifies a geographic location, and receiving a set of search results corresponding to a search query. The search results include the first results and second results. The first results match the search query. Each first result corresponds to one or more locations associated with a respective geographic location and includes links to additional information about the one or more locations.
The respective geographic location is associated with a client or a user of the client. The second results correspond to Internet-accessible documents that satisfy the search query and include links to the Internet-accessible documents that satisfy the search query. The method further includes presenting the first results and second results in a single web browser window.
What we aren't told is how Google generated the whitelist of queries, or how it decided that those terms might signify some kind of local intent when a searcher doesn't include location information within the query itself. It's obvious that someone who searches for "Pizza" is likely looking for somewhere nearby to order something to eat. It might not be so obvious for other query terms.
It's likely that the query terms and phrases in such a whitelist are generated in part according to a statistical model, perhaps one involving experimentation on the part of the search engine to see whether people select local results when those results are presented to them. The patent itself doesn't tell us where these whitelists of terms with possible local intent come from.
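Purely as a thought experiment, here's one shape such a model could take: occasionally show maps results for a candidate term, log whether searchers click them, and whitelist the terms whose local click-through rate clears some threshold. The data and the threshold below are invented:

```python
from collections import defaultdict

# Invented (term, clicked_local_result) observations from experiments
# where a maps onebox was shown for candidate query terms.
observations = [
    ("pizza", True), ("pizza", True), ("pizza", False),
    ("python", False), ("python", False), ("python", True),
]

shown = defaultdict(int)
clicked = defaultdict(int)
for term, clicked_local in observations:
    shown[term] += 1
    clicked[term] += clicked_local

# Whitelist terms whose local click-through rate clears the threshold.
THRESHOLD = 0.5
whitelist = {t for t in shown if clicked[t] / shown[t] >= THRESHOLD}
print(whitelist)  # {'pizza'}
```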
We get a sense of the kind of predictive geographical model Google might use to determine local intent for queries that don't include location-based data from patents like these, which decide the best data centers to route queries to:
- How Google Data Centers may be Split between Regional and Global Data
- How Google Might Classify Queries Differently at Different Data Centers
I’ll be looking at this patent filing and other recently granted and published documents from the patent office for other hints about Google’s local search, and sharing some of those at my Local U presentation in Baltimore. If you can make it, I’ll look forward to seeing you there.
I’m looking forward to visiting one of my favorite museums while I’m up that way, the American Visionary Art Museum:
Might Google also choose the whitelist by compiling a list of keywords within search queries that are most often followed or preceded by a location? For example, they know that when the word "pizza" is contained within a search query, it is more often than not followed or preceded by a location; thus "pizza" is added to the local intent whitelist.
Hi Justin,
That would definitely be one way of creating a statistical geographic model that could be used to generate a whitelist for different locations. 🙂
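For anyone curious, a toy version of that co-occurrence idea might look something like this; the query log, place list, and scores are all invented for illustration:

```python
import re

# An invented sample query log; a real one would hold billions of entries.
query_log = [
    "pizza baltimore", "pizza 21201", "pizza near me", "pizza recipe",
    "python tutorial", "python list comprehension",
]

# A very rough location detector: known place names or 5-digit zip codes.
PLACES = {"baltimore", "hunt valley"}

def has_location(query):
    return bool(re.search(r"\b\d{5}\b", query)) or any(p in query for p in PLACES)

def local_share(term):
    """Fraction of queries containing `term` that also name a location."""
    matches = [q for q in query_log if term in q]
    return sum(has_location(q) for q in matches) / len(matches)

print(local_share("pizza"))   # 0.5 -> a candidate for the whitelist
print(local_share("python"))  # 0.0 -> probably no local intent
```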
This is really interesting. One point that is particularly intriguing is:
“In some other embodiments, the one or more lists of terms includes a second set of terms (sometimes herein called a Short Whitelist) associated with terms that also signify local intent. As described below, the Short Whitelist is searched when geographic location information associated with either the user or client is not found when the server first receives the search query. These lists are utilized to allow a user to enter a search term, desiring a localized search result, without having also enter any location within the search query.”
I suppose this is the kind of search result you get when you search for, for instance, "electrician" in an incognito search mode. You should get a small blank box on top that says something like "Looking for local results for electrician?" and asks you for the "US city or zip" (apparently it works only for the US right now).
Regarding how the local intent might be determined, I believe what Justin said could be exactly how it is done. Here is an article that quotes a former Yahoo exec saying: "We were able to 'isolate' those search terms that signaled local intent, because they were most often searched along with a location, e.g. 'pasadena dentist', 'mechanic in san jose, ca', 'attorney 90210'. This created a large corpus of search terms that signaled local intent like mechanic, attorney, sushi, burrito, etc." Further on he says something which is also really interesting, and which you've mentioned on a few occasions in your previous local-related articles: "We could also tell the 'local intent range' of the search query; e.g. the query 'used cars' was most often searched using locations that had distances up to 50 miles (used cars los angeles), while other local search terms had local intent ranges far smaller, or at the neighborhood level, such as 'dry cleaners 92010'. So if a user searched for dry cleaners, or used cars, it would follow that you'd want to serve local ads that were up to 5 miles away for dry cleaners and up to 50 miles away for used cars." I would assume Google might be using some very similar technology.
Whoops, here is the article.
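That "local intent range" idea lends itself to a simple sketch: keep a per-term radius learned from logs and use it to filter candidate businesses by distance. The radii and the fallback value here are invented; the distance function is ordinary haversine:

```python
import math

# Hypothetical per-term search radii, in miles, learned from query logs.
LOCAL_INTENT_RANGE = {"used cars": 50.0, "dry cleaners": 5.0}

def miles_between(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(h))

def within_range(term, user_loc, business_loc):
    """Is the business inside the radius this term's local intent implies?"""
    radius = LOCAL_INTENT_RANGE.get(term, 25.0)  # invented fallback radius
    return miles_between(user_loc, business_loc) <= radius

# Baltimore searcher, business in Hunt Valley (roughly 15 miles apart).
print(within_range("used cars", (39.29, -76.61), (39.50, -76.64)))     # True
print(within_range("dry cleaners", (39.29, -76.61), (39.50, -76.64)))  # False
```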
Hi Nyagoslav,
Thanks, those are all great points.
Yes, there’s a screenshot of the kind of search box you’re talking about in the patent images, that works exactly how you describe.
Building the whitelist, or a statistical geographic model, can involve a few different processes, such as mining query sessions in Google's query logs to see whether certain query terms tend to appear alongside geographic content such as zip codes, city names, or other place names (much like the co-occurrence sketch in an earlier comment). It can also involve click logs, with Google testing to see whether onebox-type results, like maps results, tend to get clicked on when they're included within search results. See the second part of this post:
https://www.seobythesea.com/2011/10/gps-to-correct-google-maps-and-driving-directions-as-a-local-search-ranking-factor/
It's interesting to try out different queries, changing your location to different places as you do, to see whether maps results, or even localized organic results, appear. Some queries will trigger those results in some locations and not in others.
The concept of location sensitivity, and what scale of map Google might show for different queries, is really interesting, too. The page linked at the start of this post about "location sensitivity" has a section titled "Factors in Location Sensitivity," which describes some of the features that might play a role. If you're a search engine, you can look at other things as well, like how many people clicked on links for driving directions to specific locations and how far away those places were; with mobile devices, you can now even tell whether someone visited one of those places.
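Following that thought, a rough way to estimate a query's location sensitivity from driving-direction clicks would be the median distance searchers were willing to travel; the click data below is invented for illustration:

```python
import statistics

# Invented log of (query, miles to the place directions were requested for).
direction_clicks = [
    ("used cars", 32.0), ("used cars", 48.5), ("used cars", 21.0),
    ("dry cleaners", 1.2), ("dry cleaners", 3.4), ("dry cleaners", 0.8),
]

def estimated_range(term):
    """Median travel distance for a term, as a crude location sensitivity
    estimate: small values suggest neighborhood-level intent."""
    distances = [d for q, d in direction_clicks if q == term]
    return statistics.median(distances)

print(estimated_range("used cars"))     # 32.0 -> a wide-radius query
print(estimated_range("dry cleaners"))  # 1.2 -> a neighborhood-level query
```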
The Screenwerk article is interesting. That's an area that Yahoo has been investigating for a while (and Google more than likely has, too). See: How a Search Engine Might Determine Whether a Search Involves a Geographical Intent
What about using pre-existing lists of local service/institution categories, e.g. Whitepages, Yelp, etc.? They put decades of research into their categories to encompass all the local services people could be interested in.
Couple that with some semantic synonyms and you have a nice whitelist base of local queries.
The methods mentioned by Justin and Nyagoslav are definitely more advanced and would fill in the gaps. With the volume of searches on G, they can definitely data-mine a longer list of terms.
Cheers,
Oleg
I’ve got to read and reread this several times. In the meantime, I’ve got to tell you, I like the pictures. 😉
So interesting that it was filed back in 2007 and is only being granted now. Bit of a delay…
This is really tasty stuff, Bill! I can’t wait to discuss Google local search patents with you in person.
Thanks, Mary.
I’m really looking forward to the Local U Advanced sessions in Baltimore. It will be nice to be surrounded by people who are so interested in local search, and have the chance to talk shop.
Bill,
Have always loved reading your insight on patents and how Google might use them so the opportunity to hear you speak in my hometown is rather exciting! Look forward to seeing you in March.
Hi Oleg,
Thanks. It definitely doesn’t hurt to have some seed sites to uncover some possible choices.
Google does use sources like Whitepages and Yelp to determine some level of location prominence for businesses that might be listed in Google Maps. Chances are they may play some role in determining whether queries have some amount of local intent, but I would guess that Google looks at a lot more than sources like those to decide whether searchers might have a local intent when using specific queries. Click logs and query logs are collecting that kind of information from searchers themselves.
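To connect Oleg's suggestion to the log-based signals above, one could seed a candidate list from directory categories and synonyms, then keep only the terms searcher behavior confirms. Everything below is invented for illustration:

```python
# Seed candidates from directory-style category lists (invented sample).
directory_categories = {"plumbers", "pizza", "attorneys", "museums"}
synonyms = {"plumbers": {"plumbing"}, "attorneys": {"lawyers"}}

candidates = set(directory_categories)
for syns in synonyms.values():
    candidates |= syns

def confirmed_by_logs(term):
    # Stand-in for a log-based check like the local_share() sketch in an
    # earlier comment; a real check would score terms against query logs.
    return term in {"plumbers", "plumbing", "pizza", "lawyers"}

whitelist = {t for t in candidates if confirmed_by_logs(t)}
print(sorted(whitelist))  # ['lawyers', 'pizza', 'plumbers', 'plumbing']
```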
Hi Dave,
Yes on the pictures – you know how much I like that area of Baltimore, and the museum over there.
Some patents get granted really quickly, and this wasn’t one of those. But Google has been integrating maps results into web results for quite a while as well.
Hi Rich,
Thanks. I lived north of Baltimore for a number of years, and now I’m south of it. It’s nice to actually turn off Rt. 95, and visit. I’m looking forward to it. 🙂
Thanks for the reply and the additional resources, Bill. I have actually read (a few times) pretty much every article you’ve written in the past few years on local 🙂
Hi Nyagoslav,
You’re welcome, and thank you. 🙂
Very interesting info. It’s funny to think about the fact that I used MapQuest just a few short years ago. With how many local searches there are it’s not surprising that Google is taking aggressive steps to hold onto the market, especially considering that it’s the one area (in my opinion) that Facebook could actually tear away market share from them.
Sometimes this technology goes a little wrong, and Google deems a query local when it isn't. This worries me. But to patent it, I think, is genius. I don't understand patents, but I would love to understand them and create one that is forever in demand; wishful thinking. Good luck with your public speaking.
P.S. I used to frequent here but got a bit off the beaten track.
Really interested to see how location based tagging is going to influence the news and broadcasting space in the future. I think Google is setting itself up for being much more than just search. Looking for ‘pizza’ near you is one thing but I think these patents have the potential to enable much more.
Interesting read, Bill. Local search is changing rapidly, and it's definitely going to be interesting to witness how Google continues to evolve it throughout this next year.
Very interesting read as always, Bill. Without echoing Daniel's comments, I'm very excited to see how they build on this in the next couple of years. That is, provided they don't have to wait 5 years for each patent to be granted 🙂
Hi Bill,
I am wondering how Google decides to display a local map and image on the right side for a brand name search. Of late, I have started seeing this particular change in Google search.
We have been listed successfully on Google's local map and have also added schema LocalBusiness markup.
I hope this much is sufficient to get the local result for a brand name search.
How many days will Google take to display the map if anyone searches for our brand name?
Hi Deepak,
People use the phrase “brand name search” as if it means something, but the search engines don’t have a “brand name search.”
Google might show an image, a map, and information about a business as a knowledge panel result, but those don’t always show up for businesses.
As long as it takes them. 🙂
If you do as much as you can to help Google make an "authoritative" association between your website and your business name, that's a start. But I don't work for Google, and they will take as long as they want to. Patience is important if you're going to do SEO, because there are no specific periods of time that are well known and defined.
Thank you, Bill, for all the information. I will wait for Google to show our brand name search with local results. I am very much interested in local optimization, and I do a little research on it every day, experimenting with the websites I promote.
Initially I thought Google would provide the local business result as fast as it shows up for author/publisher markup 🙂
Hi Deepak,
You’re welcome.
Google isn't necessarily always going to associate a business with a query for the name of the business, especially when there's more than one business with that name. Author/publisher markup doesn't guarantee that when you do a logged-out search on Google for your name, you'll see a knowledge panel result, or that you'll see an authorship badge next to content you've created. There seems to be some threshold of some type that you have to surpass before Google will show an authorship badge. Fortunately, that threshold seems to be lower than it was before, but I saw a lot of people waiting, after they set up authorship, to start seeing their profiles next to their posts in search results.
Hi Bill,
You are right, and I have personally experienced the same with most of the client blogs we handle. There is no set time frame for getting rich-snippet results on Google; sometimes it's a week, sometimes more than that.
Hi Deepak,
Setting yourself up to succeed, by using schema and metadata to create the possibility of rich snippets, is definitely half the battle. We can't always predict how quickly Google might implement rich snippets, but in most cases they won't appear unless you've created the possibility of them appearing.
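For anyone following the schema discussion, here's a minimal sketch of schema.org LocalBusiness markup, generated here as JSON-LD (microdata is another common way to express the same properties). The business details are hypothetical placeholders:

```python
import json

# Hypothetical business details; substitute your own name, address, and phone.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Pizza Co.",
    "telephone": "+1-410-555-0123",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Baltimore",
        "addressRegion": "MD",
        "postalCode": "21201",
    },
}

# Emit a script tag that can be placed in the page's HTML.
print('<script type="application/ld+json">\n'
      + json.dumps(local_business, indent=2)
      + '\n</script>')
```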