How Google May Rate Raters


In my last post, I wrote about how Google may be incorporating Sentiment Analysis into the snippets shown for some search results. Another new feature announced at Google’s Searchology was the display of user ratings for products on some pages. We were told that these reviews could be found in “rich snippets,” which show up under the title of a page in a search result and above the snippet or description for that page.

A recent patent application from Google explores the topic of ratings, assigning quality scores to raters, and discounting or eliminating ratings from dishonest or malicious raters. It made sense to look a little more closely at the ratings that now appear in “rich snippets” and spend some time with the patent filing to see how it might affect the way ratings are shown in the future.

In a search for [new york seafood restaurants], I found one result from Yelp that showed an overall rating, the number of reviews, and an indication of how expensive the listed restaurant might be:

A search result from Google showing 4 out of 5 stars from 10 reviews below the title of the page.

One factor determining whether or not ratings will show up for a particular search result is whether or not the site listed in the result has used microformats or RDFa standards for their reviews. On one of their help pages, Google provides examples of how to format reviews and tells us that:

When review information is marked up in the body of a web page, Google can identify it and may make it available in search results pages. Review information such as ratings and descriptions can help users to identify better pages with good content.

One of the keywords in that passage is the word “may.” There’s no guarantee that Google will show reviews for all pages that are marked up correctly.
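
To make that more concrete, here is a small sketch of what marked-up review data can look like and how it could be pulled out of a page. The class names follow the hReview-aggregate microformat, but the sample restaurant, the HTML, and the extraction code are my own illustrative assumptions rather than Google's actual parser:

```python
# A minimal sketch (not Google's parser) of identifying hReview-aggregate markup
# in a page. The sample HTML and the extraction logic are illustrative assumptions.
from html.parser import HTMLParser

SAMPLE = """
<div class="hreview-aggregate">
  <span class="item"><span class="fn">Fictional Seafood House</span></span>
  Rated <span class="rating"><span class="average">4</span> out of
  <span class="best">5</span></span>
  based on <span class="count">10</span> reviews.
</div>
"""

class HReviewParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self._stack = []   # class attribute of each currently open tag
        self.fields = {}   # extracted hReview-aggregate properties

    def handle_starttag(self, tag, attrs):
        self._stack.append(dict(attrs).get("class", ""))

    def handle_endtag(self, tag):
        if self._stack:
            self._stack.pop()

    def handle_data(self, data):
        # capture the text of the properties we care about
        if self._stack and self._stack[-1] in ("fn", "average", "best", "count"):
            self.fields[self._stack[-1]] = data.strip()

parser = HReviewParser()
parser.feed(SAMPLE)
print(parser.fields)
# {'fn': 'Fictional Seafood House', 'average': '4', 'best': '5', 'count': '10'}
```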

Google also has a page where organizations interested in showing rich snippets can contact them. The form on that page may hint at other areas where Google could show this kind of structured data.

Part of a contact form from Google asking about structured data that might be available on the contacting person's site.

So, in addition to reviews, Google may be considering showing rich snippets that contain information from people profiles on social networks, local business listings, products for sale, and possibly other kinds of structured data.

All Ratings Considered Equal?

When Google shows an aggregate rating on a 1-5 scale for a site, does it count each review equally? If it does now, will it do so in the future?

Google’s patent filing explores ratings for products and services and describes how much weight they might give to certain ratings based upon who may be doing the rating.
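
Before looking at the patent itself, here is a minimal sketch of what unequal counting could mean in practice: weight each 1-5 rating by a quality score for its rater when computing the aggregate. The numbers and the weighting scheme are my own assumptions for illustration, not the formula from the filing:

```python
# A minimal sketch: aggregate 1-5 ratings, weighting each by its rater's quality
# score. The scores and the weighting scheme are illustrative assumptions.
def weighted_aggregate(ratings_with_quality):
    """ratings_with_quality: list of (rating on a 1-5 scale, rater quality 0-1)"""
    total_weight = sum(quality for _, quality in ratings_with_quality)
    if total_weight == 0:
        return None  # no trusted ratings to aggregate
    return sum(rating * quality for rating, quality in ratings_with_quality) / total_weight

reviews = [(5, 0.9), (4, 0.85), (1, 0.05)]    # the last rater is barely trusted
print(round(weighted_aggregate(reviews), 1))  # 4.4 -- the outlier hardly moves the average
```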

The patent filing is:

Rating Raters
Invented by Anurag Adarsh, Apurv Gupta, and Vihari Komaragiri
Assigned to Google
US Patent Application 20090144272
Published June 4, 2009
Filed December 2, 2008

Abstract

A computer-implemented method includes identifying a plurality of ratings on a plurality of items, wherein the plurality of ratings are made by a first user, determining one or more differences between the plurality of ratings, and ratings by other users associated with the items, and generating a quality score for the first user using the one or more differences.

The patent filing describes how Google might distinguish between honest and dishonest raters by looking at how similar a rater’s ratings are to those of others who have rated the same or similar items. It also discusses how the ratings of people who always rate positively (optimists) or always rate negatively (pessimists) might be adjusted.
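
To make the abstract a bit more concrete, here is one simple way such a quality score could be derived from the differences between a rater's scores and the consensus. The 1 / (1 + mean difference) formula is my own simplification for illustration; the patent describes the idea in far more general terms:

```python
# A minimal sketch of the abstract's idea: compare one rater's ratings with the
# average ratings other users gave the same items, and turn the differences into
# a quality score. The formula below is an illustrative assumption, not the
# patent's actual math.
def rater_quality_score(rater_ratings, consensus_ratings):
    """rater_ratings:     {item_id: this rater's 1-5 rating}
       consensus_ratings: {item_id: average 1-5 rating from other users}"""
    differences = [abs(rating - consensus_ratings[item])
                   for item, rating in rater_ratings.items()
                   if item in consensus_ratings]
    if not differences:
        return None  # no overlap with other raters, so no basis for a score
    mean_difference = sum(differences) / len(differences)
    return 1.0 / (1.0 + mean_difference)  # 1.0 = always agrees; nears 0 as disagreement grows

honest   = rater_quality_score({"a": 5, "b": 2}, {"a": 4.7, "b": 2.4})
contrary = rater_quality_score({"a": 1, "b": 5}, {"a": 4.7, "b": 2.4})
print(round(honest, 2), round(contrary, 2))   # roughly 0.74 vs 0.24
```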

Of course, a major concern in providing ratings involves people who might attempt to “game” ratings by providing positive reviews for one product to increase its rating, while possibly also providing negative reviews to decrease competitors’ scores.

Search Result Rankings Based Upon Implicit Ratings

While much of the patent filing discusses explicit ratings, where a rater chooses a score for a product or service or comment or web page, there are a few passages in the document about “implicit” ratings, such as the following:

Users may also implicitly rate an item, such as by viewing an online video without skipping to another video.

This particular part of the document seems to imply that web pages could also be ranked in search results based upon ratings from raters:

In yet another implementation, a computer-implemented system is disclosed that includes a memory storing ratings by a plurality of network-connected users of a plurality of items, means for generating rater quality scores for registered users who have rated one or more of the plurality of items, and a search engine programmed to rank search results using the rater quality scores. The items can comprise web-accessible documents having discrete, bound rankings from the network-connected users.
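
Here is a minimal sketch of what that quoted passage might look like in practice: a search engine nudging its rankings with quality-weighted ratings of the documents. The boost size, the damping, and the data shapes are illustrative assumptions, not the patent's ranking formula:

```python
# A minimal sketch of re-ranking search results with quality-weighted ratings.
# The boost size, damping, and data shapes are illustrative assumptions.
def rerank(results, doc_ratings, rater_quality, boost_weight=0.2):
    """results:       list of (doc_id, relevance_score)
       doc_ratings:   {doc_id: [(rater_id, rating on a 1-5 scale), ...]}
       rater_quality: {rater_id: quality score between 0 and 1}"""
    def rating_boost(doc_id):
        pairs = doc_ratings.get(doc_id, [])
        weights = [rater_quality.get(rater, 0.0) for rater, _ in pairs]
        if sum(weights) == 0:
            return 0.0
        weighted_avg = sum(w * rating for (_, rating), w in zip(pairs, weights)) / sum(weights)
        confidence = sum(weights) / (sum(weights) + 1.0)  # little trust -> little boost
        return confidence * weighted_avg / 5.0            # normalize to 0-1

    return sorted(results,
                  key=lambda doc: doc[1] + boost_weight * rating_boost(doc[0]),
                  reverse=True)

results = [("page-a", 0.80), ("page-b", 0.78)]
doc_ratings = {"page-b": [("alice", 5), ("bob", 4)], "page-a": [("mallory", 5)]}
rater_quality = {"alice": 0.9, "bob": 0.8, "mallory": 0.05}  # mallory barely counts
print(rerank(results, doc_ratings, rater_quality))
# page-b moves above page-a on the strength of its trusted ratings
```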

Conclusion

The patent application goes into detail on how quality scores might be computed for raters of items, and how ratings from some raters might be eliminated. Ratings might be discarded based upon a particular rater’s rating history, how quickly ratings appear from that rater, and many other reasons.
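
As one example of the kind of filter described there, a system might simply ignore raters who submit an implausible number of ratings in a short window. The one-hour window and the threshold below are my own assumptions for illustration, not values from the patent:

```python
# A sketch of one possible filter: flag raters who pack too many ratings into a
# short window. The window and threshold are assumptions for illustration only.
from datetime import datetime, timedelta

def looks_automated(rating_times, window=timedelta(hours=1), max_in_window=20):
    """rating_times: datetimes of one rater's submitted ratings."""
    times = sorted(rating_times)
    for i, start in enumerate(times):
        count = sum(1 for t in times[i:] if t - start <= window)
        if count > max_in_window:
            return True   # too many ratings packed into one window
    return False

# Example: 30 ratings submitted one minute apart would be flagged.
burst = [datetime(2009, 6, 4, 12, 0) + timedelta(minutes=m) for m in range(30)]
print(looks_automated(burst))   # True
```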

In the future, Google may do more with ratings than simply show a count and an average for items, services, or people whose site owners have used the correct markup. The processes in this patent filing may be used to separate good ratings from bad ones, and to determine which ratings we end up seeing.


36 thoughts on “How Google May Rate Raters”

  1. I can see how they’d want to do this. Just as Yelp is allowing its community to distinguish itself and just as there are bloggers out there that are the most popular, so too will raters emerge as an important voice to heed when assessing content. Smooth move on their part.

    Also, do you think the checkbox to indicate that there are ratings on your site is a CYA on Google’s part? Bing is getting a lot of heat for doing video previews because it borders on copyright infringement (i.e., if you see all you want in the preview, have you replaced the need to view the content at its original destination?). Is Google creating both an indicator and an opt-in out of this checkbox for ratings?

  2. Wow, this looks like a pretty big change. Is this the beginning of the end for SEO? I don’t want to overreact, but in 2-3 years, if they keep exploring these ideas, will SEO become obsolete?

  3. Pingback: » Pleiten, Pech, Poken & Pannen | seoFM - der erste deutsche PodCast für SEOs und Online-Marketer
  4. But I can go to my local library or a friend’s house and rate there just as easily as from my own IP address, so this system is not a true indicator of individual ratings, in my opinion.

  5. It would be good if they could control fake reviews, but I agree with Susan in that it would be difficult to prove.

  6. So in the same way that sites develop footprints and have indicators to say whether they are trusted or not, so do ‘raters’. And as their ‘trust’ is established, so is the value given to their opinion (review), whether positive, negative, or indifferent.

    For me this is such an increasingly important part of how sites are going to be perceived. I wonder if they may go down the whole Yahoo! ‘Trust rank’ route and ‘seed’ 200 trusted reviewers, and work out from there?

  7. If they can actually make this reflect real reviews, this could be great for business! It will all depend on whether they control those “fake” reviews, and reviews from people that are never satisfied with anything, no matter how good it is…

  8. Yep, it would be hard to tell fake reviews, especially from competitors. Some review sites do ask for the order number of your purchase and, based on that, they can check your consistency compared to the other reviewers.

  9. Hi jlbraaten,

    It’s interesting. The patent application does refer to something called “RaterRank” at one point, when discussing the possible use of reviews of pages to influence their placement in search results, so the processes involved in this patent might have an impact in the future. I like that Google might look more closely at the people who are providing ratings, to try to understand whether the intent behind those ratings is an honest and objective evaluation, or whether they were motivated by a desire to promote certain products and services and possibly provide negative input about competing products and services.

    Yelp seems to have jumped in first in working with Google’s use of microformats for ratings. I think Google is taking this slowly, and the checkbox and application it comes from is a chance for Google to let others apply to have their ratings show up in a controlled fashion. If Google is going to show ratings from other sites, they may have to do some work to understand the structures of those other sites, and how best to capture review data from them. I don’t think that it is a CYA approach as much as a reasoned and deliberate (read slow) approach.

  10. Hi Jack,

    I don’t see an end to SEO as much as an evolution. I think that’s constantly true: changes to the way that search engines operate require people who provide SEO services to stay up-to-date with those changes as much as possible. There’s the possibility that rating the raters of products, services, and pages may end up influencing what we see in search results, and the inclusion of ratings in search results may influence which pages people choose when they want to see ratings. The evolution of search is interesting; this path may bring about some changes worth paying close attention to, or it may end up being just another of many changes that we will see in the future of search. How much impact it will have is hard to gauge at this point.

  11. Hi Susan,

    Thank you for raising that point. The process described in the patent filing does go in a different direction than looking at the IP address of people who provide ratings. It’s possible that since Google is looking at raters who have specific profiles, and a history of providing ratings, on web sites that Google has no administrative control over, that Google may not even have information about the IP addresses associated with those raters. We are told about some other processes that may be used to rate raters – such as looking to see if there are raters that tend to rate the same products or services or pages as if they were either controlled by one person (presumably using sock puppets), or as part of some kind of voting group. It’s possible that kind of activity might cause the ratings of a rater, or a number of raters, to be considered as less than honest evaulations of those products or services or pages – regardless of the IP addresses of the raters.

  12. Hi Neil,

    Framing the idea of rating raters in terms of trust is a good approach to follow. The patent filing describes some of the approaches that Google could use to try to determine how much they trust reviews/ratings and the raters who provide them. I thought it was interesting that, towards the end of the patent, they mention the approach that Amazon uses of allowing raters to vote on whether or not they found a rating/review helpful, and that they may find ways to let raters provide similar indications of trust.

    I would guess that it’s likely that at the very least, Google has probably experimented with an approach like Yahoo’s Trust Rank on their way to developing their own approach. It would make sense for them to explore options like that.

  13. Hi Jeff,

    I agree with you. People have been relying for years on word of mouth reviews and testimonials for local services and businesses from neighbors and friends and local newspapers and magazines – open honest appraisals of where someone can get the best meals, or good car repair services, or the cheapest furniture, etc.

    The patent application does provide some insights into how they might identify dishonest reviews and how they might handle reviews from people who tend to see things from a very pessimistic or a very optimistic stance. What we are told in the patent is probably just some examples of approaches that they might take, rather than a detailed list of every step, but it’s good to see that they did account for those possibilities. We’ll have to watch and see where they go from here.

  14. Hi anne,

    I think there are some good ideas in the patent filing that go beyond Susan’s concerns about people logging in at different locations and providing reviews. It’s possible that if there aren’t very many ratings from specific raters, those ratings may not be considered until there are enough to see if there are certain patterns in the way they provide ratings. Once there are enough ratings, it’s possible that those patterns may be helpful in deciding whether or not to trust certain raters.

  15. Hi Diamonds,

    Good point. Tying a purchase to a review would go a ways towards authenticating the identity of a rater. But that might limit the number of ratings that a site would receive.

    Many sites that allow people to rate products or services require that you create an account and then your ratings are all tied together. That’s true with Yelp, with Amazon, and many other sites where people can provide ratings.

    Someone creating an account to provide a negative rating for a single competitor might not have their rating included in an overall rating. The creation of an account to provide negative ratings for a handful of competitors may stand out if those ratings were looked at together, and those may be ignored as well.

  16. Wow, that is pretty amazing. I think that testimonials are great, and I think it’s good to allow everyone to post reviews, however I think certain users that are more active and trusted should be displayed more prominently – so that there is less sabotage from business competition, etc…

  17. Hi Joel,

    Thanks. It is interesting to see Google placing more emphasis on ratings by including them in snippets. Every site that allows people to post reviews has its own processes in place to try to determine trust. For instance, Amazon.com allows people to vote on whether or not they found reviews from others useful.

    The process in this patent filing is supposed to help identify potential sabotage, and other possible problems. It’s going to be interesting to see how Google’s presentation of ratings evolves.

  18. This is all enlightening, but at what point do you know your search query results are skewed due to bookmarking and rating websites? When you’re logged in to your Google/Gmail account?

  19. Hi San Diego Web Designer,

    Google will show sentiment analysis information in snippets for pages when you use the “Show options” tab at the top of results, and then click on the “review” tab. That’s regardless of whether you are logged into Google or not.

    It’s possible that we may see sentiment results in other places as well, in the future.

  20. Bill, another well written article as always. We have done some work on getting the ratings to show in Google’s snippets. Here is a great link to find out more on how to make it possible to help Google understand that you have ratings on your pages. For those who don’t want to look, it’s quite easy really. You need to tag the ratings with microformats. We have, and it works. All is well with the world.

    http://www.google.com/webmasters/tools/richsnippets

  21. Hi Lee,

    Thank you. The rich snippets that Google introduced earlier this year in their Searchology presentation are interesting. I think we’ll see even more interesting things in snippets as more people start using microformats and HTML 5, and the search engines explore the possibilities that those have to offer.

  22. I can see the need to “rate” raters. It’s a good way for Google to maintain value in the system. By rating raters, it gives incentive for people to be truthful. Many studies have been done and people do not like negativity to reflect on their online profiles.

  23. Hi Kelly,

    A very good point. I also like the system that Amazon reviews use, where people can vote on whether or not they find a review helpful. I think this second level to reviews not only makes it more likely that someone will provide a truthful review, but it also results in better quality reviews. I think that kind of second-level review (a review of a review) could also be something useful for the kind of RaterRank that this patent filing suggests.

  24. Pingback: What I’ve Been Reading: April 2010 | Alex Minchin
  25. As people have stated, sometimes I find it hard to discriminate between fake and genuine product reviews, particularly where there are no obvious indicators. I am pro free speech on the internet where it doesn’t cross the line and become proscribed speech. I have read sites named “ihateX.com” and “www.XYZscamreview.com” where there appears to be very valuable and informative discussion occurring. However, until it occurred to me that the rater may be hijacking the traffic of a brand by using that brand in the anchor text, I didn’t realise that the reviewer might be engaged in monetisation. Even realising they were, it was difficult for me to dismiss the fact that there appeared to be some very informative content which provided real value to readers on a subject matter of interest.

    As for gripe sites, or suck sites in domain names, I do believe that there are people who genuinely use the internet to air their grievances or publish matters of public interest. I have read domain name decisions in which panels make an automatic assumption that the poster has registered and is using the name in bad faith, although the National Arbitration Forum seems to lean more in favour of protecting free speech. I dislike fake or paid product reviews (cash for comment), but I would hope it would take more than a domain name to be the arbiter of whether someone was credible. I would imagine if I saw a product review site with lots of affiliate ads and tell-tale signs I might be able to discriminate. Does the patent give any weighting to words incorporated in the domain name?

  26. I wonder why not just use the actual access to a page through Google as a way to measure popularity (in a way, a click-through-rate measure): the number of actual clicks on the proposed page (link) shown by Google on screen, divided by the number of times it was shown.

  27. Apparently, if a user actually clicks the link of a search result, then it means that there’s some sort of agreement with it. In cases where Google can “see” (via a cookie of a Google domain or of an affiliate) the page view timing or other clues of user satisfaction from the actual “landing” on the page, that is an additional indication of satisfaction. So, perhaps raters are not so good, and they are biased and not representative?!

  28. Hi Adele,

    The patent really doesn’t address issues such as the authority or authenticity of the site a review appears upon, or related signals such as the domain name.

  29. Hi Shlomo,

    It’s possible that Google does look at a click through rate to measure popularity, and may also look at other factors such as how long someone may have viewed a page that’s been clicked to indicate satisfaction with that page. Google may also try to look at other factors, such as the position of the page within the search results and a prediction of how many clicks the page might receive at that position.

    One patent from Google that provides an example of how click data might be used is Method and apparatus for classifying documents based on user inputs, which attempts to look at click data to determine whether a search result might be: (1) on topic, (2) off topic, (3) Spam
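
    As a loose illustration of that click-data idea (the thresholds and the time-on-page heuristic are my own assumptions, not that patent’s method), a classifier along those lines might look something like this:

    ```python
    # A small sketch, only loosely inspired by the click-data idea mentioned above:
    # classify a result from aggregate click behavior. Thresholds are assumptions.
    def classify_result(impressions, clicks, avg_seconds_on_page):
        ctr = clicks / impressions if impressions else 0.0
        if ctr < 0.01:
            return "off topic"        # almost nobody clicks it for this query
        if avg_seconds_on_page < 5:
            return "possible spam"    # people click but bounce right back
        return "on topic"

    print(classify_result(impressions=1000, clicks=150, avg_seconds_on_page=45))  # on topic
    print(classify_result(impressions=1000, clicks=120, avg_seconds_on_page=2))   # possible spam
    print(classify_result(impressions=1000, clicks=3,   avg_seconds_on_page=60))  # off topic
    ```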

  30. Pingback: How Does Google Rate Raters For Google Plus Local?
