How Might Google Police for Map Spam?
When Google ranks businesses at locations in Google Maps, it turns to many sources to find mentions of the name of a business coupled with some location data. It can look at the information that a site owner might have provided when verifying their business with search engines such as Google, Bing, and Yahoo. It may look at sources that include business location information, such as telecom directories like superpages.com or yellowpages.com, or business location databases such as Localeze. It likely also looks at the website for the business itself, as well as other websites that might include the name of the business and some location data for it. Sometimes Google comes across map spam when looking at those sources.
What happens when the information from those sources doesn’t match? Even worse, what happens when one of these sources includes information that might be on the spammy side? A patent granted to Google this week describes a way that Google might police for map spam in such places. The patent warns against titles for business entities that include terms such as “cheap hotels,” “discounts,” and “Dr. ABC – 555 777 8888.” It also might identify map spam in categories for businesses that include things such as “City X,” “sale,” “City A B C D,” “Hotel X in City Y,” and “Luxury Hotel in City Y.”
Just What is Map Spam, Exactly?
In the context of a business entity, information that skews the identity of or does not accurately represent the business entity or both is considered spam.
For example, if the business entity is a manufacturer of widgets, and information describing the manufacturer contains the text item “best manufacturer of widgets,” then the word “best” may convey more than merely the business category of the entity, and consequently may skew the entity’s identity.
Whereas “a manufacturer of widgets” would objectively describe the business entity, the addition of the word “best” subjectively describes the business entity and possibly inaccurately represents the business entity. Such information is described as spam.
Repeating the same word multiple times might also be a sign of map spam. Including words and letters in number fields, like those for phone numbers, could also be construed as map spam.
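A check like the one described for number fields could be as simple as flagging any letters that appear where only digits and separators belong. The patent doesn’t publish an implementation, so this is a minimal hypothetical sketch of that idea:

```python
import re

def looks_like_phone_spam(phone_field: str) -> bool:
    """Flag a phone-number field that contains letters.

    Hypothetical check (not from the patent): a legitimate value should
    contain only digits and common separators such as spaces, dashes,
    dots, parentheses, or a leading +.
    """
    return bool(re.search(r"[A-Za-z]", phone_field))

print(looks_like_phone_spam("555-777-8888"))          # False
print(looks_like_phone_spam("Dr. ABC 555 777 8888"))  # True
```

A real system would presumably normalize formats per country, but the core signal is the same: non-numeric content in a numeric field.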
The patent tells us that Google could take several steps if it identifies this kind of map spam. It could ignore it in some cases, or it could use it to negatively impact the rankings of businesses. If location matters to a business (customers can visit its office in person to conduct business, or it operates in specific regions), a listing in Google Maps can be helpful. And since Google Maps results might be integrated into ordinary Web search results for some queries, being ranked well in Google Maps could be beneficial.
The patent is:
Determining spam in information collected by a source
Invented by Anurag Adarsh and Piyush Janawadkar
Assigned to Google
US Patent 8,332,415
Granted December 11, 2012
Filed: March 16, 2011
Methods, computer-readable media, and systems for determining spam in information collected by a source are described. A frequency of occurrence of a phrase included in text items received from a source is determined. The text items are associated with business entities and do not include any spam.
Another frequency of occurrence of a phrase included in text items received from another source is determined. The text items received from the other source may or may not include spam. From the frequencies, likelihoods that a phrase is spam are determined. From the likelihoods, another likelihood that a different text item includes spam is determined.
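The abstract above boils down to comparing phrase frequencies across a trusted and an untrusted source and turning that comparison into a likelihood. The patent doesn’t disclose its formula, so the scoring function below is an illustrative assumption (a simple smoothed ratio), not Google’s actual method:

```python
from collections import Counter

def phrase_frequencies(text_items):
    """Count how often each phrase occurs across a list of text items."""
    counts = Counter()
    for item in text_items:
        counts[item.lower()] += 1
    return counts

def spam_likelihood(phrase, trusted_counts, untrusted_counts):
    """Toy likelihood that a phrase is spam.

    A phrase frequent in untrusted listings but absent from trusted ones
    scores close to 1.0. The +1 is Laplace-style smoothing to avoid
    division by zero; the whole formula is a hypothetical stand-in for
    whatever the patent's implementation actually computes.
    """
    t = trusted_counts[phrase.lower()]
    u = untrusted_counts[phrase.lower()]
    return u / (u + t + 1)

trusted = phrase_frequencies(["Acme Widgets", "widget manufacturer"])
untrusted = phrase_frequencies(
    ["best widget maker", "best widget maker", "cheap widgets", "Acme Widgets"]
)
print(spam_likelihood("best widget maker", trusted, untrusted))  # ≈ 0.67
print(spam_likelihood("Acme Widgets", trusted, untrusted))       # ≈ 0.33
```

The actual business name appears in both sources and scores low, while the promotional phrase appears only in the untrusted listings and scores high, which matches the intuition the patent describes.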
There are a few steps involved in this analysis of whether information associated with a business entity might be map spam.
Trusted Sources v. Untrusted Sources
One of those steps involves whether or not a site that includes information about the business is a trusted source or an untrusted source. Information from trusted sources wouldn’t be considered spam, but information from untrusted sources might be map spam.
A source might be considered trusted based upon things such as the reputation of the source, or previous dealings with the source, or both. If a site isn’t considered to be a trusted source, then it’s designated as an untrusted source.
In some cases, the determination of a source being untrusted might be decided manually.
Another step has the search engine mining sites for phrases, to see how frequently those phrases appear on the pages that mention a business entity.
The business entity may have several attributes, such as title, business category, telephone number, address, URLs pointing to a website, and so on.
Values for these attributes may be collected from the different sources, including some attributes that are consistent from one source to another, such as title (name of the business), business category, and an address.
The information might be retrieved by the search engine crawling the pages it appears upon, or even electronically via an XML feed.
When information from these different attribute fields is collected at untrusted sites, the search engine might determine, with some level of confidence, whether a phrase is legitimate or possibly map spam.
Untrusted Sources and Frequency of Occurrence
Phrases from an untrusted source are considered by the search engine based upon how often the phrases occur in both a trusted source and an untrusted source. If a phrase doesn’t appear in the trusted source but appears a few times in an untrusted source, the likelihood that it is spam increases. If there are multiple phrases like that from the untrusted source, that also increases the amount of confidence the search engine has that a phrase is map spam.
So what impact might this have?
If a resource ranks for a query in part because of information from an untrusted source (such as “best plumber in Dallas” in a category field), the search engine might adjust the rank of the resource downward, based upon the measure of spam it calculated for the use of that phrase in the untrusted source.
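The patent only says the rank may be adjusted downward based on the spam measure; it gives no formula. One plausible sketch is a multiplicative demotion, where both the multiplicative form and the `penalty_weight` parameter are assumptions for illustration:

```python
def adjusted_rank_score(base_score, spam_measure, penalty_weight=0.5):
    """Demote a result's score in proportion to the spam measure
    computed for the untrusted-source phrase that helped it rank.

    Hypothetical: spam_measure is in [0, 1]; penalty_weight controls
    how harshly spam affects the final score.
    """
    return base_score * (1.0 - penalty_weight * spam_measure)

print(adjusted_rank_score(10.0, 0.8))  # demoted from 10.0 to roughly 6.0
print(adjusted_rank_score(10.0, 0.0))  # no spam detected: score unchanged
```

A score-based demotion like this would let a listing still appear, just lower, which fits the patent’s language about adjusting rather than simply removing results.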
The patent provides additional details on how phrases might be identified, possibly by the use of n-grams, and on when sources originally determined to be untrustworthy can become trustworthy (when the same phrases tend to occur frequently in both the trusted and untrusted sources).
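Word-level n-grams, which the patent mentions as one possible way to identify phrases, are straightforward to extract. A minimal sketch (the title used here is one of the example categories from the patent discussion above):

```python
def ngrams(text, n):
    """Extract word-level n-grams from a text item."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

title = "Luxury Hotel in City Y"
print(ngrams(title, 2))
# ['luxury hotel', 'hotel in', 'in city', 'city y']
```

Each extracted n-gram could then be counted across trusted and untrusted sources and scored for spam likelihood as described earlier in the patent abstract.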
Information from Users
The patent also points to the possibility that Google could also provide forms that users could use to provide information about business entities as well. Clusters of users might be created based upon similarities in things like “demographics, category of edits, language, and the like.”
Some of those clusters might be considered trusted sources or untrusted sources, and phrases in the information they provide might be reviewed in the same way as phrases from untrusted web page resources.
If someone submits their website to several review sites, telecom directories, and regional business directories, including phrases such as “best widget makers,” “cheapest widget builders,” and “superior widgets” in the category fields for those sites, and those phrases don’t occur (or occur only rarely) on their own website, which Google may have determined to be a trusted site, Google might consider those phrases from the untrusted sites to be map spam.
If that site is determined to rank for one of those phrases because of the influence of the untrusted sites, its rank might be adjusted downward.
What this seems to suggest, if you’re a site owner, is that you should check and take control of the different business listing and database sites that might include your business, and make sure that they don’t contain spammy phrases about your business.
Last Updated May 18, 2019
52 thoughts on “Google Tackles Map Spam for Businesses”
Bill, fantastic insights, as always. Many thanks for posting. I wonder where Google’s own listings (Google Places) factor into the question of “trusted” vs. “untrusted” sources.
More specifically: I would think that if someone creates a Google Places listing that has non-compliant words (e.g. “best”), when that listing (we hope!) gets pulled by Google, the penalized listing itself would count as an “untrusted” source of information and count against any *active* listings for that business.
This is a great breakdown of an important patent in the local space Bill. Thanks for distilling this.
You would be surprised how many people submit keyword stuffed categories and descriptions to our citation building service. We always review the supplied information and recommend against this when we see it. This post is going to be great for us to refer clients to when they push back against our recommendations.
Every time Google starts tagging any element of the web as spammy, I always pick up on a ton of collateral damage from well respected sites and webmasters.
I am sure this update will also have its share of collateral damage.
It is interesting to see, however, that Google is drilling down into what I am assuming to be “localized search term” SEO being that they are eyeballing G-Maps rankings in the way you described.
My take away from this is to make each and every profile unique, which basically negates the use of any type of automation.
Can anyone give examples of other words or phrases in category fields that may be flagged as spam based on this patent?
What are the possibilities that Google uses this patent for doing the same thing with the SERPs?
All of these new updates, rules and regulations from google are turning my brain to mush! Why must they continue to ‘shake things up’? Is it just their way of keeping everyone honest? Or do they have some bigger plan in mind?
Bill thanks for the information. There is so much to stay on top of especially in the local end of SEO. One mistake and you may find your ranking dropping and never know the reason why.
Bill, as always you do such a great service to the industry by breaking down patents in a simple layman’s fashion. I can’t tell you how many clients we come across who go as far as altering their citation sources or their own map listings even after they’ve been standardized with consistent NAP. This post is going to come in handy!
Excellent information about geographic spam, Bill! This indeed is one of the main factors that SEOs and small businesses need to watch when optimizing their websites. Great post. Hats off!
“Another step has the search engine mining sites for phrases, to see how frequently those phrases appear on the pages that mention a business entity.”
This is interesting.. by this, you mean the link’s surrounding copy, within the actual citation, right?
Great article Bill. I have been following your blog for many months now and I must say this information is golden.
One thing that concerns me is that the patent itself clearly points out the fact that negative SEO can and must exist with these local businesses?
Great article. I started to follow your blog a month ago or so, and since then I’ve been reading every post. I have two things to say:
1. Thank you
2. Keep up the good work
And about that article, and my general thoughts on Google’s strategies: on the one hand I’m glad that they are doing everything to make the internet a better place where content is king (not SEO), but on the other hand I’m always scared that another Google trick will filter my pages.
I think your post about Google Map spam is very interesting. Honestly, it seems like spammers are attracted to every single possible venue….people are always looking for an edge.
I am glad to see that google tackles this kind of spam because sometimes business don’t even list the services the rank for. Overall, I am very happy with googles evolution focusing more and more on quality. I hope in a few years there won’t be a chance to rank well unless you truly provide great content.
Overall, this is a great way to cut down the local search spam, and get the real, legit bussinesses some exposure. Google is definitely taking its spam checking to another level, and by 2013, you’ll probably see much better Google results.
Wow, that is crazy. I bet a lot of marketers would love to know this information. They probably think that by using words like “cheap” and “best” they will attract traffic, when in fact they are being marked as spam.
Very interesting article, your recommendation seems obvious after reading through it.
But for example, I did a very basic submission of my website to several directories. To what extent can some of those directories be seen by Google as untrusted sources? By comparing a trusted source to an untrusted source? How does it work?
Sorry if these are dummy questions; I’ve recently launched my website and I’m trying to understand the Google logic.
I’m a newbie too, only found your blog in the past couple of days and have spent a few hours traipsing through your archives.
Really good stuff and thorough, wish I had the time to write this sort of stuff – keep it up Bill!
Interesting patent. I guess I knew this was coming, as I was talking with a client this week about having the right citations for their business. I told them that their business name, address, and phone number had to be the same in order for them to rank higher on Google, but they didn’t believe that. Right now they have a bunch of issues, since they moved locations and lost their rankings, and now their address is incorrect and they are fixing it slowly on sites they have access to.
I’m glad Google will finally cut down on Google spam so real businesses can get back to where they belong!
This is a good thing. I run into a lot of people who want to abuse Google Local/Places in order to try and cheat the search engines. It’s hard to explain to people who think they have a good idea that others have been there before, and the loophole has long been closed.
Spam gives SEO’s a bad name, so I applaud Google.
However did you chance upon this patent? Nice analysis 🙂
Recently there has been an upward trend of people focusing on Local SEO. Citations are frequently mentioned as a way to rank high on Google Listings. Rand Fishkin also did an article about discovering citations. This piece of article seems to be hinting at what people should or rather, should not do when building citations. Am I understanding it correctly?
I read patents the way that many people read cereal boxes. 🙂
The “citation” that Rand alluded to in his white board Friday was actually co-occurrence, and his post had nothing to do with local search at all.
It doesn’t hint about what people should do when building citations – it states that they shouldn’t spam citations.
In fact, Google extends and adapts to businesses the rules it has already implemented for sites.
This confuses me, mostly because what Google here is calling SPAM (putting keywords in category fields) I would call adding helpful descriptions that give the user a better idea of the specific services or products a particular business offers. I understand banning words like “best” or “most trusted”, and other subjective adjectives, but Google also warns against using language that describes what the business does, instead telling us to only tell users what the business is. This is just counter intuitive to me if the over arching goal is to give the user the most complete information possible. Anyone have some insight?
If a field says business name, and instead you put something like “Plumber in Austin, Texas”, it looks like Google is treating that as spam, because it looks like you’re doing it to be more relevant for search results for a plumber in Texas, even though you are. That’s not the name of your business. While it’s descriptive, and it may be true, it’s still not the name of your business.
I totally agree with you on this Google map spam stuff.
Favorited. I’m curious as to what will be spammy vs informational.
“atlanta bankruptcy attorney” vs “bankruptcy law”. The latter is more informational and what I would consider a true categorization.
Thanks Bill for another interesting article. For some of us who are not techy, it is very helpful. The problem, each time I learn how to do something like searching, by the time I master it, something new comes along. Happy holidays.
Great explanation about the Google patent, Bill, and thanks for the heads up.
When I registered for Google Places, I noticed that there were already quite a few business competitors using Key Word phrases instead of their business name. This is good news. It will keep the playing field level.
I understand Google’s reasoning for the rules they have and for changing things around to keep spam out or at least knock it down drastically, but like with anything else, too much of something will end up hurting those that ARE trying to comply and are doing their best to keep up with Google’s new rules and regulations. When you keep adding and pushing after awhile the good gets pushed out with the bad also. I hope they are able to figure something out so that legitimate business/website owners aren’t constantly penalized for trying to keep up with their new changes.
Actually, Google Maps is just another place to spam, like everywhere else. Anyway, in my honest opinion, I think that Maps, thanks to Google, is a safe and genuine place, because I don’t see a lot of spam, at least here in Italy.
Interesting article, Google is doing a lot of good stuff when it comes to combating web spam. Stumbled upon this site today and bookmarked it: very refreshing content. What else to expect from someone who reads “patents the way that many people read cereal boxes” :).
Firstly, nice and interesting post.
Secondly, why would something like “Dr. ABC – 555 777 8888” be bad/wrong? Why is it getting targeted as spam? Cheap, discount, coupon… yes, but why the Dr. stuff?
If the field just asks for a phone number, adding anything else is being construed as spam. I don’t disagree with that at all.
Spamming even in the local directory side of SEO…who knew!?!? As a small business owner that has to use good old fashioned shoe leather to try and build an on-line presence it is both reassuring to me that Google is cracking down on such trickery but also unsettling to know that the deck is stacked against folks like me. Most people think of the web as some sort of meritocracy but my limited experience suggest it is more like “pay to play.” Thank you Bill for you continued excellent and informative posts.
I think this is a good step forward. This will tackle similar tactics used in social media spam. Different companies setting up multiple profiles with “over-optimized” text in order to rank. People did it for business citations as well, as you point out. People just need to do things the right way and stop trying to manipulate the search engines. Even if it works today, it won’t tomorrow, then all you will be doing is playing catch up.
That is great to hear! I have 5+ roof company locations on Google Maps, and it is usually very tough to show up for the main keyword I target, with all the spammy names that are not the business name. Most competitors use things like “August Roofing | Siding Company” or “Roof Repair Contractors,” yet their company names are far less SE friendly. I have been hesitant to do that, and just recently did it on my newest location for my Augusta roofing contractors. Since reading this update I will definitely revise that one. I would also like to see how they treat P.O. box addresses, which the article speaks of tackling, but P.O. box addresses do produce tacks on the map. Thanks so much for the info. Will be following more from now on. Thanks Bill!
Good Stuff! I just recently gave in and submitted a google places listing for my Augusta Roofing Company address because I have been following the proper procedures and still have been falling below the blackhat guys using SE friendly text in the name fields and so on. It took a lot for me to give in and go dirty but after reading this post it was definitely all I needed to assure me that the google Karma Gods will get them in the end and I just want to be safely on land watching them float away from the top listings
Pretty interesting insight into a very unknown domain. This could be very helpful for those looking for high-end SEO in the local area. As Steve has pointed out: “I understand banning words like ‘best’ or ‘most trusted’, and other subjective adjectives, but Google also warns against using language that describes what the business does, instead telling us to only tell users what the business is.” Putting in these simple, age-old words is now really considered SPAM and can be pretty harmful, especially as a local SEO factor.
I have been battling spammers for the last two years with Google Local competitors and am happy to hear they are finally clamping down. My fear is that whitehat promotion business owners will end up suffering as many of us did after the Panda and Penguin updates due to the overzealous nature of the big G….
Very interesting Bill; it goes to show, Google always seems to find a way to deal with spam… so why try? It’s not worth it IMO.
Address: best SEO this side of Mars
Phone number: super duper SEO guy
Category: bestest SEO, Bestly SEO, Super-Bester SEO,
Sad to say I used to use keywords in citations back in the days when Yahoo! was relevant 🙂 Anymore, it’s just spammy and a bad idea. Google is definitely leading the charge on forcing quality everything it seems.
This article is a very informative one, as it tackles spam. As we all know, spam is all over different sites nowadays. It’s impressive that the author has given us such information on how to deal with this. What I learned after reading this blog is that I now understand the process of breaking down patents and trusted links and sources. It’s easier to understand because he delivered it in layman’s terms that everybody can understand.
I agree this is one of the main factors that SEOs and small businesses need to watch when optimizing their websites or they suffer.
Comments are closed.