People often search the Web for reviews of products they might buy and merchants from whom they might purchase goods and services. It’s easy to lose track of time reading reviews on sites like Amazon, where people seem to enjoy sharing their opinions about almost anything. It’s not so easy to find online reviews of the merchants around me in my somewhat rural community.
Reviews are interesting when it comes to how search engines might treat them, how they could impact local search rankings, how a search engine might identify review spam, and the potential effect of online reviews upon word of mouth, the reputations of businesses, and the sales of goods and services.
For instance, Google Rich Snippets can show the number of stars a place has received in reviews from a particular source, such as Yelp:
Google Place results that appear in a Google Web search may also show an average rating and the number of reviews for several businesses:
Do those ratings and reviews influence in some way whether or not particular businesses show up in Google Places search results? Do they influence searchers’ choices in where to visit, after seeing the ratings and the numbers of reviews? How much influence might reviews have, and how important is it for search engines to handle those reviews intelligently?
I ran across a Google research paper which collected and studied information about product reviews, merchant reviews, and Netflix ratings for movies.
The product review data contained over 8 million ratings of 560,000 products, gathered from 230 sources, and reviewed by 3.8 million authors. The merchant reviews data included 1.5 million ratings for 17,000 merchants collected from 19 sources and written by 1.1 million authors. The Netflix reviews of 17,700 movies consisted of 100 million user ratings submitted by 480,189 authors.
The paper, presented at the Fourth International AAAI Conference on Weblogs and Social Media, is titled Star Quality: Aggregating Reviews to Rank Products and Merchants, and it details a joint Google/Carnegie Mellon University study by Mary McGlohon, Natalie Glance, and Zach Reiter that asks and attempts to answer many questions about online reviews of products and merchants such as:
- Given a set of reviews of products or merchants from a wide range of authors and several reviews websites, how can we measure the true quality of the product or merchant?
- How do we remove the bias of individual authors or sources?
- How do we compare reviews obtained from different websites, where ratings may be on different scales (1-5 stars, A/B/C, etc.)? (A rough sketch of this kind of normalization follows the list.)
- How do we filter out unreliable reviews to use only the ones with “star quality”?
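The second and third questions are the most concrete ones. As a rough illustration of the general idea (and not the method the authors actually use), here is a minimal Python sketch that maps ratings from two hypothetical sources with different scales onto a common 0-1 range and then subtracts out each author’s average leniency. The source names, scale mappings, and data layout are my own assumptions:

```python
from collections import defaultdict

# Hypothetical raw data: (source, author, item, rating). The sources,
# rating scales, and field layout are assumptions for illustration only.
reviews = [
    ("siteA", "alice", "widget", 5),   # siteA uses a 1-5 star scale
    ("siteA", "alice", "gadget", 5),
    ("siteA", "bob",   "widget", 4),
    ("siteA", "bob",   "gadget", 2),
    ("siteB", "carol", "widget", "A"),  # siteB uses letter grades
]

# Map each source's scale onto a common [0, 1] range
scales = {
    "siteA": lambda r: (r - 1) / 4.0,
    "siteB": lambda r: {"A": 1.0, "B": 0.5, "C": 0.0}[r],
}
normalized = [(author, item, scales[src](r)) for src, author, item, r in reviews]

# Treat each author's mean normalized rating as that author's bias
by_author = defaultdict(list)
for author, _, score in normalized:
    by_author[author].append(score)
author_bias = {a: sum(s) / len(s) for a, s in by_author.items()}

# Debiased score: how far a rating sits above or below the author's norm
debiased = defaultdict(list)
for author, item, score in normalized:
    debiased[item].append(score - author_bias[author])

# Aggregate per item by averaging the debiased scores
print({item: sum(s) / len(s) for item, s in debiased.items()})
```

Averaging the debiased scores is about the simplest aggregation possible; the paper itself explores more sophisticated ways of aggregating and weighting ratings than this.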
The authors make some interesting observations; for instance, when a reviewer has written only a single review, that author is disproportionately likely to have given a rating of 5 out of 5.
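That pattern is straightforward to check in any ratings dataset. Here is a quick, hypothetical Python illustration (the data and field layout are made up, and this is not the authors’ code) comparing how often single-review authors hand out the top score versus more prolific authors:

```python
from collections import defaultdict

# Hypothetical (author, rating) pairs on a 1-5 scale
ratings = [("a1", 5), ("a1", 4), ("a2", 5), ("a3", 5), ("a4", 2), ("a5", 5)]

by_author = defaultdict(list)
for author, rating in ratings:
    by_author[author].append(rating)

def share_of_fives(groups):
    """Fraction of ratings across the given author groups that are 5 stars."""
    pooled = [r for rs in groups for r in rs]
    return sum(1 for r in pooled if r == 5) / len(pooled)

single = [rs for rs in by_author.values() if len(rs) == 1]
prolific = [rs for rs in by_author.values() if len(rs) > 1]

print("5-star share, single-review authors:", share_of_fives(single))
print("5-star share, prolific authors:     ", share_of_fives(prolific))
```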
In their analysis, the authors also considered whether reviews were submitted anonymously, how prolific reviewers were, and whether the reviews could help determine the quality of the reviewed goods or services.
As I mentioned above, I have been surprised by how few reviews there are for merchants in my community, but I expect the numbers to grow in the future. What kind of influence might those reviews have upon businesses in the area?
To a degree, that will depend upon how search engines like Google might make that review information available to anyone who tries to find it. The authors of this paper tell us that this is a subject that hasn’t had much research behind it at this point:
This is the first work, to our knowledge, over aggregated reviews from different sources. We observe that there are often biases of different sources and authors – different authors and review communities will often have very different behavior. We compare reviews coming from these different review sites and investigate how this may help deduce the true quality of an object rated.
Should a review from Yelp be given as much weight by Google as one from a local newspaper? Or from another review site? How effective might the search engines be in identifying spam reviews?
Are there more authoritative sources of data that the search engines might tap into to find helpful ratings – for instance, the paper mentions the possibility of looking at Better Business Bureau ratings to get some “insight into which merchants are most reliable”?
I think there is some relevance to the number of stars you have on your Google reviews. Google wants to present the best websites and services, and having a high number of stars and reviews can help a company. I’m not a big fan of Yelp, as it seems anyone, including your competitors, can post a chatty comment about your company. Writing a review on Google takes many steps and weeds out a lot of people, making those reviews more valuable.
I think an important factor to take into consideration for a review’s value is always the reviewer’s motive.
I’m glad to see that they are looking at the specific patterns of the reviewers themselves and not just the reviews. That statistic about a reviewer with a single review most likely having given a 5/5 is very eye-opening. It lends credence to the idea that a large portion of 5-star reviews are not legitimate.
I agree. It can also be very difficult to judge.
A few months ago, I was asked to look over the site of a law firm, and one of the things that really stood out for me was that, on their Google Places page, there was one and only one review. It was an absolute rave, and it just happened to include a couple of keyword phrases that were probably highly prized by law firms in their geographic area. I checked out the profile of the reviewer, and it was his one and only review.
I put in my notes, in the most diplomatic terms I could come up with, that it was probably a bad idea to have a fake review on their Places page. They wrote back thanking me for my honesty but pointed out that the review was real, written by a real client, and without any urging from them. They knew it looked like they’d hired someone to write it, or that they’d written it themselves, but it was real.
It’s an interesting avenue that Google is heading down by showing an increasing number of reviews within their search results.
As Bob said, it’s fairly easy to spot what appear to be ‘fake’ reviews, and I usually discount any review I read that is over the top with praise and contains a suspicious amount of keywords.
I’m also not always put off by a bad review, if the party being reviewed has taken the time to respond to the bad review and post their reply publicly.
I find it reassuring to know that, although the company may not be perfect, they are helpful and willing to resolve any issues that may have occurred as a result of their product or service.
I too think that ratings drawn from actual reviewers will provide the best results for the number of stars shown for products or merchants.
Yelp has a review filter in place which filters out overly positive and negative reviews, as well as reviews by nonactive members. Overall, this feature has received positive feedback from Yelp members and consumers. Some business owners haven’t been happy about the policy because it may have affected the number of positive reviews displayed about their business, especially those written by nonactive Yelp members who’ve only written a few reviews. I think Google is smart not to promote just Google Places but also other business rating services, probably a must in the growing era of social search. Here’s a link to Yelp’s blog about their review filters: No longer available.
Hi Sam,
I’m not sure that showing the most positive or highly ranked review is necessarily always Google’s aim. I just published a new post about a Google patent application published this week which says that they will try to show “representative” reviews, and might cluster together reviews by ranges of ratings. If those ratings tend to be negative or neutral rather than good, it’s possible that Google might show a representative review that’s negative or neutral.
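Purely to make that clustering idea concrete, here is a small Python sketch of how reviews might be grouped into rating ranges with one review surfaced from the dominant group. The bucket boundaries and the “pick the longest review from the largest cluster” rule are my own simplifying assumptions, not details taken from the patent filing:

```python
from collections import defaultdict

# Made-up reviews as (rating on a 1-5 scale, review text)
reviews = [
    (2, "Slow shipping and no reply to my emails."),
    (1, "Item arrived broken."),
    (2, "Mediocre experience overall."),
    (5, "Fantastic service, would buy again!"),
]

def bucket(rating):
    """Assumed rating ranges: negative, neutral, positive."""
    if rating <= 2:
        return "negative"
    if rating == 3:
        return "neutral"
    return "positive"

clusters = defaultdict(list)
for rating, text in reviews:
    clusters[bucket(rating)].append(text)

# Surface a review from the largest cluster; review length serves as a
# crude stand-in for "most informative" within that cluster.
dominant = max(clusters, key=lambda b: len(clusters[b]))
representative = max(clusters[dominant], key=len)
print(dominant, "->", representative)
```

With this toy data, the negative cluster dominates, so the representative review shown would be a negative one rather than the lone rave.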
Your points about Yelp and Google reviews seem like one of the reasons why Google would spend time studying the differences between reviews. I would guess that Google might give more weight to Google reviews than Yelp reviews, because it has more data about the reviewers and may be able to better judge the credibility of people leaving those reviews.
Hi Kentaro,
I agree that motive is an important thing to consider, but unfortunately something that we might have trouble interpreting. Even if a reviewer includes their motive for leaving a review, we can’t always know how much we can trust that motive.
Hi Guillermo,
I think that kind of data about reviewers is fascinating. The tendency for people to leave a high rating when they’ve only submitted one review is intriguing. I don’t know if we should trust those reviews less because of that, or interpret that tendency as many people being motivated to leave a glowing review because they found something they thought was exceptional enough to share with others.
Hi Bob,
I think there is a tendency to be skeptical when we see something that seems too good to be true. I would probably have sent a similar message in that circumstance. I wonder if others seeing that particular review come to the same conclusion.
Hi Dave,
I think Google finds including information about reviews can be really helpful to searchers. I was a little bit fortunate that Google had a pending patent application published this week on how they might interpret the quality of reviews when deciding which ones to display on pages like Google Place pages.
One of the interesting things about that patent filing was that reviews that might seem to have “a suspicious amount of keywords” in them could be chosen as representative reviews because they contain interesting, relevant, and somewhat uncommon but appropriate terms and phrases.
I agree about not necessarily being put off by a bad review, and seeing how a business might respond to that review as something that can actually be positive.
Hi Tessa,
I do think the number of stars showing in a Google Places listing can have a significant impact upon which results a searcher might select to view.
Hi Meri,
Interesting article from Yelp on why they try not to share too much information about their review filter. It’s a different take on deciding which reviews to display more prominently than the approach Google seems to follow. I think having a variety of approaches is a good thing.
In learning about reviews for my company, I have found that the more reviews I have, the easier it is to win more clients, because they see my company as reputable and highly used, and in the end that is the goal, not necessarily what Google does with the reviews. I simply send my clients to the review sites to give me a review, and that in itself is where the value lies.
Hi Michael,
Those are good points. I do think that when Google presents reviews to searchers, it provides the potential to bring more visitors to your site, or to have them contact you via Google’s Place pages as well. There’s some value in that, too.
Reviews are a very tricky subject. We have not included reviews on our website for the simple fact that they can be written by anyone, and when it comes to reviews of businesses, in many cases the positive reviews are written by the owner of the business and the negative reviews by competitors. Unless you are able to trace the person who wrote a review and confirm its accuracy, I believe reviews are in many cases useless.
Hi John,
Reviews definitely are a tricky subject. Google does seem to be interested in collecting information about people who write reviews, and may create reputation scores for those writers and use other criteria to weigh and value reviews and to find biased ones. Most of all, people do seem to be searching for reviews, so Google is trying to respond to what searchers want.
Hi Bill, I think this goes beyond a question of reviews impacting Google Places listings, on to their potential impact as a ranking factor more broadly. The online retailer I work with (a purely ecommerce business, no bricks-and-mortar channel) is seeing an aggregated rating appear in its AdWords listings. As well as their potential to impact CTR in AdWords, I’m wondering if these ratings (and possibly the sentiment in extended reviews) might not feed into Google’s algorithm at some point?
In the case of this particular ecommerce business, of 3,148 reviews, 99% are from just 2 sites (the most significant being Google Checkout). The other 14 ratings come from 3 other UK-centric review sites. One possible scenario that would be interesting to explore is how Google might treat one website having 1,000 ratings from 20 sources, compared with another having 3,000 ratings from just 5 sources. Also, what impact the aggregated scores themselves would have in that same scenario.
It might be that a large number of ratings on one review site might positively or negatively influence others’ decisions to leave their own feedback. A campaign to incentivize customers to leave feedback on third-party review sites might be best spent focused upon just one or two sources, or spread more evenly over many sources. It’s got me tied in knots, this one 🙂
If you’re aware of any Google patents that might shed some light on this I’d love to know about them!
Hi Dave,
I’ve written about a few different Google patents on reviews and ratings, but most of those focus upon things like reputation scores for reviewers or how Google might choose the reviews to be displayed in places like Google Places. This paper is the first in-depth research I’ve seen that looks at reviews from different sources to try to make sense of how best to compare them to each other, and maybe use that information in meaningful ways.
There might be a patent related to the research, but if there is, it’s likely so new that it hasn’t been published yet by the patent office. Believe me, I’ll be keeping a lookout for anything like that.
I have often seen sites that have more reviews with horrible ratings show up on top of some sites that have only a few reviews but amazing ratings. Is this because they have more reviews or because their website has better SEO? Thanks
Hi Clark,
In Web search, it’s possible that the actual reviews have little if anything to do with how those sites are ranked by the search engine, and things like relevance and popularity (of the PageRank variety) play a role.
In local search, the reviews may have more of a role in rankings, but there are a large number of other ranking considerations as well, which can include things such as distance from some centerpoint near the searcher, relevance of the business to the query, prominence of the business (with location information) on the Web, the category or categories that the site might be listed in (compared possibly to a category for the query), and more.
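Just to illustrate how several of those considerations could be blended together, here is a toy Python sketch of a weighted local-search score. The factor names, weights, and formulas are placeholders of my own, not anything Google has disclosed:

```python
# Toy local-search score blending a few of the considerations mentioned
# above. Weights and factor values are placeholders for illustration.
def local_score(relevance, prominence, distance_km, review_score, n_reviews,
                weights=(0.4, 0.3, 0.2, 0.1)):
    """relevance, prominence, and review_score are assumed to be in [0, 1]."""
    w_rel, w_prom, w_dist, w_rev = weights
    proximity = 1.0 / (1.0 + distance_km)            # closer -> higher
    # Damp the review signal for businesses with very few reviews
    review_signal = review_score * min(1.0, n_reviews / 10.0)
    return (w_rel * relevance + w_prom * prominence
            + w_dist * proximity + w_rev * review_signal)

# A nearby business with many strong reviews vs. a distant one with few
print(local_score(0.8, 0.6, 1.0, 0.9, 25))
print(local_score(0.8, 0.7, 8.0, 1.0, 3))
```

In a sketch like this, a handful of glowing reviews can easily be outweighed by proximity, relevance, and prominence, which is consistent with the point that reviews are only one of many local ranking considerations.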