The Most Relevant Reviews or the Highest Quality Reviews?
You may want to write a review that more people might see, and that a search engine might use as a “representative review” to display for a business, product, or service. If so, there are some things you might want to keep in mind while writing, at least according to a patent filing from a couple of Google employees. It isn’t officially assigned to Google at this point, but it does list a Google patent application I wrote about last November on Reputations for Reviewers and Raters as a related filing.
The patent describes “quality” signals in the ideal review and has some good advice about what Google might consider the most relevant reviews.
Besides looking for the most relevant reviews, Google seems to like well-formatted reviews.
Run your review through a spell checker and grammar checker – chances are that Google will like that.
A few short sentences aren’t enough, and a few long paragraphs are too much.
Avoid sentences that are either too long or too short and make sure that those sentences have beginnings, middles, and ends – sentence fragments aren’t favored at all.
ALL CAPs are considered RUDE by Google (and by lots of other people online).
It’s also wise to avoid profanity and sexually explicit content in most reviews. We’re told that type of language “often contribute[s] little or nothing to an understanding of the subject and can make the user who is reading the reviews uncomfortable.”
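These formatting signals are simple enough to sketch in code. The toy function below is my own illustration, not anything from the patent; the thresholds for “too short” and “too long” are invented for the example.

```python
import re

def formatting_flags(review: str) -> dict:
    """Rough formatting signals of the kind the patent hints at.
    Thresholds here are made up for illustration, not from the filing."""
    sentences = [s for s in re.split(r'[.!?]+', review) if s.strip()]
    words = review.split()
    # Share of words written entirely in capital letters (shouting).
    caps = sum(1 for w in words if len(w) > 1 and w.isupper())
    avg_len = len(words) / len(sentences) if sentences else 0
    return {
        "sentence_count": len(sentences),
        "avg_sentence_length": avg_len,
        "all_caps_ratio": caps / len(words) if words else 0.0,
        "too_short": avg_len < 4,    # likely sentence fragments
        "too_long": avg_len > 40,    # likely run-ons
    }
```

A review written as `"GREAT CAMERA. BUY IT NOW."` trips both the all-caps and the fragment flags, while an ordinary descriptive sentence passes cleanly.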
In addition to the most relevant reviews, Google seems to like reviews with appropriate words.
What Should Relevant Reviews Contain?
Relevant reviews should contain “high value” words rather than mostly “low value” words. A high-value word might be appropriate for a particular kind of review, identified from a dictionary of words that might be associated with reviews. For example, if the review is about a digital camera, it might ideally contain words like aperture, image stabilization, DSLR, sensor type, lens system, still image formats.
Or, the search engine might look at the frequencies of words that appear in the review and see if there is a high frequency of less common words included. Here’s how the patent filing describes that approach:
Concerning values associated with words in the review, reviews with high-value words are favored over reviews with low-value words.
The word values are based on the inverse document frequency (IDF) values associated with the words in some embodiments. Words with high IDF values are generally considered to be more “valuable.” The IDF of a word is based on the number of texts in a set of texts, divided by the number of texts in the set that include at least one occurrence of the word. The reviews engine may determine the IDF values across the reviews in the reviews repository and store the values in one or more tables. In some embodiments, tables of IDF values are generated for reviews of each type.
For example, a table of IDF values is generated for all product reviews, a table is generated for all product provider reviews, and so forth. That is, the set of texts used for determining the table of IDF values for product reviews are all product reviews in the reviews repository; the set of texts used for determining the table of IDF values for product provider reviews are all product provider reviews in the reviews repository, and so forth.
Each subject type has its own IDF values table because words that are valuable in reviews for one subject type may not be as valuable in reviews for another subject type.
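Here’s a quick sketch of what building one of those per-subject-type IDF tables might look like. The `log(N / df)` form is the textbook definition of IDF; the filing only says the values are “based on” that ratio, so the exact transform is my assumption.

```python
import math
from collections import defaultdict

def idf_table(texts):
    """Build an IDF table from one subject type's review corpus.
    Rarer words get higher values; the log transform is assumed."""
    doc_freq = defaultdict(int)
    for text in texts:
        for word in set(text.lower().split()):
            doc_freq[word] += 1
    n = len(texts)
    return {w: math.log(n / df) for w, df in doc_freq.items()}

# A tiny stand-in for the "all product reviews" set:
product_reviews = [
    "great aperture and fast lens",
    "the lens is sharp",
    "battery life is great",
]
idf = idf_table(product_reviews)
# "lens" appears in 2 of 3 reviews, "aperture" in only 1,
# so "aperture" carries the higher value.
```

In the patent’s scheme, a separate table like this would be built for product reviews, another for product-provider reviews, and so on, since a word can be valuable in one subject type and commonplace in another.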
Another factor that Google might consider in deciding which reviews to show for products or merchants, on pages like the Google Place page for a merchant, is whether the review is representative of other reviews.
How Representative Might a Review Be?
To determine how “representative” a review might be, the search engine might cluster different reviews based on those reviews’ shared characteristics. Those shared characteristics might cover a variety of aspects involving the reviews. For instance, reviews of books ordered from an online book store might focus on the book’s storyline or how quickly the book was shipped, or upon the author, or similar books.
If the reviews include ratings, those might be used to cluster the reviews. Something reviewed might have 8 ratings from 5 stars down to 3.6 stars, indicating positive reviews. It might also have 3 ratings between 1 star and 2.3 stars, indicating negative reviews. And it might have 5 ratings between 2.4 stars and 3.5 stars, indicating neutral reviews. Since there are more reviews in the positive range, a review or two might be selected from those positive reviews to display, using the kinds of “quality” criteria above.
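Using the numbers from that example, the bucketing might look something like this. The star cut-offs come straight from the ranges above; everything else is my own sketch.

```python
def bucket_ratings(ratings):
    """Group star ratings into the positive / neutral / negative bands
    from the example above (cut-offs at 3.6 and 2.4 stars)."""
    buckets = {"positive": [], "neutral": [], "negative": []}
    for r in ratings:
        if r >= 3.6:
            buckets["positive"].append(r)
        elif r >= 2.4:
            buckets["neutral"].append(r)
        else:
            buckets["negative"].append(r)
    return buckets

def largest_bucket(buckets):
    """The band holding the most ratings supplies the representative review."""
    return max(buckets, key=lambda k: len(buckets[k]))

# The 8 positive, 5 neutral, and 3 negative ratings from the example:
ratings = [5.0, 4.8, 4.5, 4.2, 4.0, 3.9, 3.7, 3.6,
           3.5, 3.2, 3.0, 2.8, 2.4,
           2.3, 1.5, 1.0]
buckets = bucket_ratings(ratings)
```

With these numbers, `largest_bucket(buckets)` picks the positive band, and a representative review would then be chosen from it using the quality criteria described earlier.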
The most relevant reviews patent is:
Selecting High Quality Reviews for Display
Invented by Kushal B. Dave and Jeremy A. Hylton
US Patent Application 20110125736
Published May 26, 2011
Filed: January 26, 2011
A method and system of selecting reviews for display are described. Reviews for a subject are identified. A subset of the identified reviews is selected based on predefined quality criteria. The selection may also be based on zero or more other predefined criteria. A response that includes content from the selected reviews is generated. The content may include the full content or snippets of at least some of the selected reviews.
Most Relevant Reviews Conclusion
Earlier this week, I wrote about a Google study that explored how reviews from different sources might be aggregated together. It also looked at how they might be compared to one another.
The study, and Google’s display of starred ratings and counts of reviews in both organic search results and Google Place page listings, give us a sense of the importance Google places on reviews. Google isn’t just looking for the most relevant reviews. They are also looking for different quality signals to surface quality reviews.
This patent filing gives us a sense of how Google might choose the most relevant reviews to display. That can be based upon the quality of the reviews and how those might be clustered together to display representative reviews to people who want to access reviews quickly.
It’s also interesting to see how Google might define reviews in terms of “quality,” considering how much that term seems to have filled the search landscape lately.
Google’s quality scores for Adwords advertisements and landing pages have been used to help set a price for sponsored ads within Google’s search results for a while now. I recently wrote about a Google patent filing for web publishers’ pages that would help determine how much those site owners might earn for displaying Google Adsense ads based upon the quality of their pages. In addition, Google’s Panda updates for search rankings and the blog posts that Google has published about it have provided us with guidelines for building quality sites.
Quality can seem like an abstraction. Or an aspect of something that’s both hard to define and measure and that might be subjective enough to mean quite different things to different people. But, we’ve been seeing over and over in different contexts that Google is more than willing to define “quality” features or attributes of advertisements, landing pages, advertising publishers pages, web pages, and now in reviews.
78 thoughts on “How Google Might Choose the Most Relevant Reviews to Display”
I almost always do a search for “product blog” on Google to see if I can’t find some reviews by random bloggers. These seem to be the most trustworthy and most in-depth compared to Amazon.
I welcome this change, but I think it might also dilute my product + blog search technique as more and more people catch on and start using it.
Bill, seems like your research is bringing you a lot on the “quality” side in your last 3 articles.
Your first paragraphs are also interesting for the webmaster of an e-commerce site, because they reinforce the case for giving reviewers quality guidelines (no caps, spellchecking, no sexually explicit content, sentences that are neither too long nor too short, etc.).
I can clearly see that the SEOs who crack the quality code will have a much easier time earning rankings and maintaining them.
The recent changes Google made to their algorithm are just obvious. Google’s #1 target is customer (searcher) satisfaction. They get this satisfaction through high-quality content in the SERPs, BUT the big problem is that the robots are blind and (of course) not human 🙂 So they need ranking factors which correlate well with our human eye and our ability to check in seconds whether a site is a high- or low-quality website.
What remains is the following: which ranking factors make the difference when two equally high-quality websites compete with each other? So we have two tasks in SEO.
1) Build high-quality websites to outrank your low-quality competitors
2) Have a strong and reliable SEO strategy to compete with other high-quality websites
The big thing is – start with 1) before you get your strategy ready 🙂 Most SEOs concentrate on their SEO strategy and forget to do their homework at the beginning.
It’s hard not to focus upon quality these days in light of the search engines’ increasing emphasis on the subject, in places like the Panda updates.
Though I do have to say that the phrase “quality score” showed up in a lot of Google patents as a way of describing the combination of relevance scores and importance scores that would determine the ordering of search results at places like Google.
Some of Google’s patents do provide some other interesting definitions for the term “quality score,” such as one on Universal Search (see: How Google Universal Search and Blended Results May Work) which defined a quality score as a unique mix of ranking features for different types of results that could be used to compare different types of results against each other.
A different quality score showed up in a recently granted Google patent that I wrote about when it was initially published as a patent application, in Google’s Query Rank, and Query Revisions on Search Result Pages. That quality score defined the quality of a query, which would be “estimated from user click behavior data estimating the length of clicks on search results.”
I think it has become more important to keep an eye on what Google and the other search engines are saying about quality.
If a blog is genuine, and has trustworthy and credible person behind it, I tend to give it some weight when looking at a review of something.
But, I know I’ve been approached by people who want me to write reviews for their products and services, and I turn them down. Some of them have even offered money, and some have also offered to write the review for me. I’ve also seen blogs that appeared trustworthy that were purchased by others, and continued on, but with “product placement” links suddenly appearing every so often.
So I tend to be more careful these days when seeing a blogger reviewing or endorsing something.
Very good points. I’ve always believed that creating high-quality content is an essential element of an intelligent SEO strategy. Being unique, remarkable, engaging, credible, and useful are the kinds of things that will get people to return to your pages in spite of search engines.
I think this is exactly right: “If a blog is genuine, and has a trustworthy and credible person behind it.” Part of being credible is not being a shill for junk. For blogs I read regularly, I would notice a pattern of shilling. And even for a blog I don’t, you can pretty quickly search and get a decent guesstimate of their credibility.
Hi Bill, I’m probably stupid, but I actually had no idea that there were so many spam reviews, or people who actually made up “junk reviews” for merchants. After reading this, I think what Google is doing is definitely a good thing. Most people will buy products based on good, word-of-mouth reviews, especially during the current economic times. Hopefully, this change will make it a lot easier for consumers to get their money’s worth for products bought online. Thanks for the well-written article.
I’m a big review hunter before I buy, but some reviews are clearly spam, or reputation management agencies bulk-submitting them. I think it can only be a good thing if Google picks up on these and distinguishes the review repeated a million times from the genuine, in-depth, honest reviews that took time to create.
I recently submitted a review for a real estate client, and quite luckily I mentioned the service offered and the ease of finding a place. This patent will actually cut down on the spam reviews that are automatically submitted by bots, and thereby help customers choose products or services much more easily. Nice post covering such an important aspect.
I’ve worked extensively with reviews and this approach to quality is … troubling. Spelling, grammar and paragraph construction may not always be indicative of a quality review for a couple of reasons.
First, not everyone has these skills. A quick look at any comment stream or a simple scan of craigslist makes this abundantly clear.
Second, reviews are generally not given as much attention as other forms of writing. It’s a very transitional and ephemeral piece of content, with somewhat limited investment and ownership.
Even those written on blogs may not fit this definition of quality since they are often there in support of a MFA (AdSense or Amazon) site, which necessitates a slightly different take on wording and text length.
While I would very much like to think that everyone aspires to produce such ideal content, I believe it’s challenging with reviews and by applying such restrictions Google may lose out on some very important and informative reviews.
I couldn’t agree more with Bill. I am a believer of the basics. In terms of SEO, I always believed that great content is what people would drive back to the site. Same with reviews, if you get noticed by a wider audience as someone who makes substantial and competent reviews, people would always come back to read more of yours. Great content is really a big deal. I mean one can drive as many people to their site following varied techniques, but what would make visitors revisit the site? – great content that is.
Bill, do you have any ideas about how representative a review is? I’m thinking it should be rather tough to create a universal algorithm for this, and Google has a history of preferring universal approaches to actually working specifically within a segment. How would someone go about comparing the storyline to the review of the book?
I’m working myself on a number of review sites and I’ve been having some problems with this, so any input is appreciated. One thing I’ve been running a number of tests with is LDA, but I can’t really see any good results from it.
As I was reading this post and all the potential requirements therein with respect to a Google-friendly product review, I couldn’t help but remember that link you posted to the site that gave your page copy a quality score. Do you remember what it was? It assigned your page a score that basically told you what educational level you write at and who you essentially write for.
After this post, I am thinking about running all of my posts through that checker.
Additionally, the comment about grouping the reviews with varying ratings together just gave me a seriously BH idea…(oops)…
In my opinion, Google puts more weight on where the reviews are coming from than on how they were written. If you do a quick check on Google’s Place pages and 7-packs, you’ll see that most of the reviews come from the same 5-6 directories, regardless of the reviews’ formats.
Great information that we otherwise might have missed — thanks!
Some commenters have expressed concern about Google’s use of spelling and grammar to judge content. The reason search engines have to rely on things like grammar and usage errors is simply that they don’t have enough control over human languages to be good judges of style and content.
Interested in the ‘high value’ words point. Of course some subject knowledge may help, but in many situations the ‘ordinary’ man may not have the technical words to hand and certainly in something like wine tasting it could all become rather comical.
Useful information, thanks Bill. What do you think about the quality of the source of publication determining the quality and accuracy of the review? i.e. will more reputable websites have higher quality reviews?
‘High-quality’ reviews that follow proper grammar and phrasing will reduce the number of offshore fake reviews, that’s for sure!
On the other hand, it requires us as marketers to pray really hard that our review acquisition efforts won’t go to waste when clients just write a low-quality review.
Your analysis makes common sense, which may be what Google is pushing. Reviews with little substance will not be prioritized, while reasonably in-depth reviews have a chance. I doubt Google is pushing for PhD-level reviews, but they are prioritizing reviews with correct spelling, reasonable analysis, correct punctuation, and readability. Interesting that this was some of what I learned in composition class, but these basic skills are being lost with instant communication. Thirty years ago, your ability to present a position was embedded in how well you could write an interoffice memo. Today that ability rests in email, but too much email is poorly written and seldom proofread. I wonder if Google is just grading the quality of the composition, much like our past composition teachers.
I think it’s interesting, and a little bit disturbing, that Google continuously tries to quantify quality. However, I DO see the upside of this particular effort to increase credibility and relevance in product reviews. Nowadays you’re more likely to buy a certain product based on reviews and forums than on information from the company selling the product.
I wonder if these reviews will be selected manually from a list compiled by the “robots.” Without a manual, human review, I’m afraid that the content still won’t be the best available. How are they going to quantify quality anyway? Once the formula gets figured out, it will be replicated by the content spammers and we will be right back where we started.
My main concern is that virtually every good review i’ve ever read has contained loads of profanity!
I agree with Petter…. it’s a little bit disturbing. The logic behind it is understandable, but just because a reviewer’s spelling, grammar, language, etc. doesn’t pass the quality test of a computer program, is their opinion any less valid?
I agree with AJ Kohn on the point that reviews are different from website or blog content, and that maybe there should be different “quality” standards for them. Spam is spam, but different kinds of quality content can’t really be judged on the same measures, which makes efforts to get rid of spam and promote “quality” content tricky.
I was actually reading today on how to provide review data to Google via microdata.
It appears you can provide code on your webpages to offer Google review data on just about anything. This would be especially important for product pages or service businesses. Very interesting stuff.
I could see this going well. It would certainly be beneficial for reputable online merchants, because sometimes the reviews displayed are irrelevant or written by competitors with malicious intent. I think negative comments by competitors will be difficult for Google to detect, though. Reviews can be finicky at best, but I think if they are done correctly they could help online consumers.
I respect that some bloggers might want to maintain a level of anonymity, and that others may just not want to share too much about themselves, but even with those sites where you can’t find out much about the people behind them, it only takes reading through a few posts to get a sense of what motivates and inspires them.
I like looking at reviews before I buy things like cameras and televisions and other electronics, and health related products. If Google can pull out the “genuine in depth and honest reviews” reviews, and choose to display those, I’m all for it.
Thanks. I’m not sure that I’ve ever seen any statistics about how many spam reviews there are, and it can sometimes be hard to tell whether a review is genuine or something written for economic gain or inspired by some bias. Sometimes reviews do seem too good to be true, but it’s possible that people do really like the product or service that they’ve received enough to motivate them to write something glowing. And sometimes they are too good to be true.
I discovered a lot of great music and books from reviewers whom I trusted long before the Web made reviews so prominent. It’s a lot easier for people to write and share reviews these days, and we probably have to exercise a lot more skepticism because of that. Google finding ways to try to gauge the quality of reviews is a positive step in my eyes.
Hopefully an approach like this will make it less likely that spam reviews and paid but undisclosed endorsements will be prominently shown by Google on pages like their Google Places pages.
I’m not sure how many authentic and legitimate reviewers of a product or service really see themselves as “reviewers” or are concerned too much about how a search engine like Google might rate the quality of their reviews. I’ve written a handful of reviews for local shops in my area, not because I wanted to promote myself as a reviewer, or because I was paid to do so, or asked. I just really like the places enough to spread the word. I wish there were more reviews of merchants where I live, with honest and authentic reviews. It’s a little disheartening to see a local auto repair place that’s been around for more than 50 years, that does a ton of business, and when I look them up online, I can’t find a single review of their business.
I have noticed that on Google shopping results the reviews are getting a lot more prominent recently. It would be good for Google to present a representative review but the problem is that with so many reviews and opinions on things across the web sometimes you come away more confused than when you started. Even looking to buy something as simple as a mop can leave you daunted and not knowing which one is really best. Hopefully Google can create a good way of choosing the best one.
This approach isn’t going to eliminate or mask or hide reviews, but it might make some more prominent than they otherwise might have been, somewhat like the “positive” and “negative” reviews that Amazon will show at the top of a page of reviews. How do they decide which to place there?
Given a large number of reviews to choose from, how should Google decide which reviews they should put on something like a Google Place page? I like that they seem to have some kind of plan in place for choosing those, but I do agree with you that many genuine and helpful reviews aren’t always written as if the reviewer were F. Scott Fitzgerald or Ernest Hemingway, or even Lester Bangs or Robert Christgau.
It’s probably worth spending some time actually studying the reviews that Google does show off on Google Places to see how closely they might be following the ideas presented in the patent. I know that they’ve been looking at presenting sentiment in reviews as well, and this patent doesn’t mention that aspect of reviews at all.
Thanks. One thing that I’m concerned about is that I’m usually much more interested in the products or services that I’m looking for reviews of than I am in finding a reviewer who writes with good grammar and spelling and produces great content. I’d honestly rather see an honest, insightful, and informative review than one where everything is spelled correctly but doesn’t offer much else.
By “representative,” I believe that Google is attempting to cluster reviews together, and then pick out the highest “quality” reviews from the most representative clusters. For example, given 100 reviews for a book, 30 of those might primarily be about the plot of the book, 25 might be about the author’s writing style, 20 might be about how quickly the book was shipped to their homes, 15 might be about other books that were written by the same author, and 10 might be about similar books written by other authors. So, those reviews might be clustered around those different topics, and “representative” reviews might be selected from the most popular of those – maybe one about the plot of the book, and one about the author’s writing style.
Replace “book” with a plumbing service. 40 reviews about a specific plumber might be about how much or little that plumber charges. 25 might be about the quality of his or her work. 20 might be about other plumbers in the area, and the last 15 may cover unrelated topics.
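A toy version of that topic clustering might look like the sketch below, with hand-picked seed terms standing in for whatever clustering method Google actually uses (a real system would discover the clusters rather than hard-code them):

```python
from collections import Counter

# Toy seed terms standing in for learned cluster centroids; these topics
# and words are invented for the example.
TOPICS = {
    "plot": {"plot", "story", "ending", "characters"},
    "shipping": {"shipped", "arrived", "delivery", "package"},
}

def assign_topic(review: str) -> str:
    """Assign a review to the topic whose seed terms it overlaps most."""
    tokens = set(review.lower().split())
    overlap = {t: len(tokens & seeds) for t, seeds in TOPICS.items()}
    best = max(overlap, key=overlap.get)
    return best if overlap[best] > 0 else "other"

reviews = [
    "the plot drags but the characters are great",
    "arrived quickly and the package was intact",
    "loved the story and the ending",
]
clusters = Counter(assign_topic(r) for r in reviews)
# The biggest cluster ("plot" here, with two reviews) would supply
# the representative review.
```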
Clustering to determine groups that representative reviews might be chosen from could also be based upon ratings associated with the reviews, so a book that receives 10 five star reviews, 2 four star reviews, and 10 two star reviews might have those reviews clustered around those ratings, and representative reviews might be selected from amongst the reviews with 5 stars and from the reviews with 2 stars.
Hopefully people won’t start deciding what rating to give a product or service based upon the other ratings for those, in hopes of their review being a “representative” review. I’d rather they would rate based upon how much they liked or disliked the product or service in question.
I did write a “test your blog” post that had a few different tests on it, including a reading level test, and I’ve seen a number of sites that will give you a rating score based upon things like how SEO optimized your pages are. I don’t think it hurts to run your site through things like a reading level checker, but we both know there’s more to writing for a specific audience than what an automated program might tell you about your writing. 🙂
There is a very real possibility that the source of most reviews is as important, if not more so, than the quality of the review itself. The post I wrote before this one was on a Google study that did some comparison of different review sources and how reviews from those sources might be compared against each other – See: Google Research Paper on Online Reviews for Merchants and Products.
In addition to the quality of the review itself, Google might be looking at the source of the review and well as things like how trustworthy the reviewer might be – have they published other reviews before, do they seem like they might tend to rank everything highly or everything negatively, etc. Also worth a look is my post on a patent from Google that is related to the one in this post: How Google may Manage Reputations for Reviewers and Raters
Good points. It’s much easier for a computer to do things like count the number of words, sentences, and paragraphs in a review, the lengths of sentences, the variety of terms used and how common or rare those might be, how related those words appear to be to the subject of the review (how often the words tend to show up together in documents that might be found on places like the web), how well words are spelled, and how grammatically correct the content of the review is, than it is for the search engine to judge the style and actual content of the review. Google has devoted a lot of energy and resources toward building statistical language models that might help them evaluate aspects of things like reviews, but there’s only so much a computer might be able to do.
Sometimes the best review of a product or service might only be one or two words long. 🙂
There’s definitely room for lots of different opinions from lots of different people when it comes to reviews. One person might like reviews of books from literary scholars, while others would rather see reviews from people who will only pick up a book for something to do when they’re laying out on the sand at the beach. It’s quite possible, given the language in the patent, that Google might bias how they choose reviews about books based upon unique terms that only subject matter experts might use when they write about a topic. It would be interesting to collect some data about the reviews that they do choose to see if that’s going on.
An approach like this might help to filter out, or at least make less prominent, reviews that are hastily written and paid for based upon quantity rather than quality, instead of being submitted because someone wanted to share their opinion of something.
It’s quite possible that Google will consider the source of a review as some aspect of deciding which to show. See my comment above to Yoni about some related posts that discuss that more.
Good points. How would a teacher decide between work produced by students to show a visitor what their class has been learning recently? Possibly some of the same things that Google is pointing out with this patent filing.
I think a large part of what Google does requires them to determine the quality of content that they display to searchers, based upon automated methods that are used to decide between large number of sources. It’s not surprising that Google would apply similar approaches to decide which reviews they might show to be representative of other reviews.
I agree. 🙂
I suspect that Google does check on some of the reviews, as sort of a quality control type approach, but the majority are probably chosen without some kind of manual intervention. There are just too many reviews to check each one, and that’s not usually how Google operates if they can help it.
Besides, increasing the cost of attack for spammers (forcing them to spend much more time and effort on individual reviews and on building the reputations of reviewers, so that it’s more work to spam reviews than it is to do something else) is not an uncommon approach in other areas of addressing web spam from Google, Yahoo, and Microsoft, wherever they may find it.
That may just be the case. 🙂
I’m not sure that Google sees what they are doing as judging the validity of reviews submitted by reviewers, but rather as selecting amongst a set of reviews to showcase that might inspire people to click and read more of the reviews submitted by others. If they choose reviews that might not be very well written as representative of the body of reviews for a particular service or good, those might not get as many click throughs.
Good points. In some ways, that’s similar to the challenge that Google faced with Universal Search, and deciding where best in search results to show images or news articles or book results or web pages. Comparing the different types of content against each other is like comparing reviews that might appear on Yelp or on a blog or in an academic journal. I don’t think it’s an easy undertaking.
The search engines are trying to provide ways for us to uncover information for them, and microdata can help do that.
Though, there were a good number of food bloggers who were upset that microformats weren’t the easiest thing to get your hands around when Google added recipe search. Recipes from larger sites, with money to hire more technically proficient designers, were showing up more prominently in those recipe searches. I’m not sure that’s still the case, but people were releasing plugins to make it easier for food bloggers to present recipes in microdata formats, so hopefully that’s helped.
I’d actually be happy to see Google rely on the primary method that Amazon uses to rank those reviews: customer feedback. The ‘Was this review helpful to you?’ responses provide a solid feedback signal to surfacing the more valuable reviews. At that point, it’s just a matter of selecting the most helpful positive (4-5 stars) and negative (1-3 stars) review.
Yet there are some limitations to this methodology, particularly around the recency of a review and giving newer reviews an equal opportunity to accumulate votes. So, in many ways, there’s a halo bias in presenting and soliciting reviews in this manner.
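One common way to soften that halo bias is to rank by a confidence bound on the helpful-vote proportion rather than the raw ratio, so a review with a handful of early votes can’t leapfrog well-tested ones. A minimal sketch, not anything Amazon or Google has documented; the scoring choice is my own illustration:

```python
import math

def wilson_lower_bound(helpful: int, total: int, z: float = 1.96) -> float:
    """Lower bound of the Wilson score interval for the helpful-vote ratio.

    A review with 4/4 helpful votes scores lower than one with 90/100,
    so a few early votes don't dominate, but new reviews still surface
    as votes accumulate.
    """
    if total == 0:
        return 0.0
    p = helpful / total
    denom = 1 + z * z / total
    centre = p + z * z / (2 * total)
    spread = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total))
    return (centre - spread) / denom

reviews = [
    {"id": "a", "helpful": 4, "total": 4},     # few votes, all positive
    {"id": "b", "helpful": 90, "total": 100},  # many votes, mostly positive
]
ranked = sorted(
    reviews,
    key=lambda r: wilson_lower_bound(r["helpful"], r["total"]),
    reverse=True,
)
```

Here the 90/100 review outranks the 4/4 review, even though its raw ratio is lower, because we have much more evidence about it.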
I too think it would be interesting to look at the reviews on Place pages and see how closely they match the ideas in this patent and others (including the source of that review.)
In the end, I’m for systems to help better reviews rise to the top, but I remain unsure that such ‘academic’ standards are always the best method to do so. For instance, the reviews from contractors on building materials might not qualify as quality reviews, but that front-line review might be more valuable than the well written review by a retailer or distributor, or misguided review from a DIY dilettante.
I’m confident Google will resolve these subtleties given the interest and focus they have on reviews.
Interesting thoughts. We do know that Google is trying to consider sentiments expressed in reviews, even though this patent filing doesn’t address that. Matt Cutts talked about that in Google’s Searchology event two years ago, and Google has published at least one detailed whitepaper about it. I wrote about those in Google’s New Review Search Option and Sentiment Analysis
I’m not sure how well that has helped Google with malicious reviews, but I am somewhat surprised that the patent filing I’m writing about in this post didn’t address sentiment.
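For a sense of what the simplest form of sentiment analysis looks like, here is a toy lexicon-based scorer. The word lists are invented for illustration; real systems, like the one described in Google’s whitepaper, are far more sophisticated:

```python
# Toy sentiment lexicons; these word lists are illustrative only.
POSITIVE = {"great", "excellent", "friendly", "fast", "reliable"}
NEGATIVE = {"rude", "slow", "broken", "terrible", "overpriced"}

def sentiment_score(review: str) -> int:
    """Count positive words minus negative words.

    A score above zero suggests positive sentiment, below zero negative.
    """
    words = review.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

A real system would also have to handle negation ("not great"), sarcasm, and aspect-level sentiment (the food was great but the service was slow), which is where most of the difficulty lies.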
I hadn’t noticed an increase in the prominence of reviews associated with shopping search. Thanks for pointing that out. Reviews can be helpful, and they can be daunting sometimes too. I guess for people who do want to see reviews, making them easy to find is probably a good thing, though.
User ratings on reviews the way that Amazon does them are helpful, but I think you’re right that there are some limitations to the effectiveness of solely using an approach like that.
If I want to find out about which place I should bring my car to get it worked on, I’m more likely to ask friends who know something about cars, regardless of how well they might write. I’m not sure that the threshold that Google might have in place using the quality signals they mention are intended to filter out people as much as they are to try to make the reviews they display appear more attractive (at least on the surface) to people searching for reviews.
I’m a web designer by trade who tries to keep up with SEO when I can. Anyway, I usually set up Google Places, Bing Local, Yelp business listings, etc., for clients after I build their websites.
I have one client I’ve had for over a year. I built his website and set up a Yelp business listing. I advised him to get his happy customers to write reviews and left it with him months ago; of course, he was too busy to do this. Now, all of a sudden, he gets two bad reviews and is scrambling to get his happy customers and friends to write good reviews. He got three or four friends to write reviews, and after two days Yelp took down the good reviews, and only the bad ones are still left. I told my client I assume it looks really fishy to Yelp (and other review-enabled directories) that after a year of no reviews, you suddenly get two bad reviews, then three or four good reviews. Is my assumption right? What can he do? What steps should he take? I think if his friends write other honest reviews, with the tips you suggested, then they can build trust with Yelp as reviewers, and their initial reviews of my client’s business may be reinstated?
I really like your comment about making ‘the reviews they display appear more attractive’.
I think that may be a large part of the impetus, because of the idea of perceived relevance. A review might not be the most helpful, but at a glance (which is about all you get on the Internet these days before someone makes a decision), having a review that is perceived as valuable might attract engagement.
As always thanks for your continuing service to the SEO industry and raising the bar on SEO dialog.
Good to hear that you’re paying attention to the SEO aspects of getting pages online and helping your clients get more visibility.
The Yelp reviews that were written in response to the negative reviews you mentioned probably did look pretty suspicious to the people at Yelp, even if they were honest and authentic.
Not quite sure how Yelp might perceive additional reviews from those particular reviewers after their initial ones, but it’s probably more likely that Yelp will respond positively if they see a legitimate interest on those reviewers’ part in providing helpful and honest reviews.
Thanks for your kind words.
The perception of relevance does seem to be a key part of the decision on which reviews to display. I didn’t write about the section of the patent that describes how Google might generate snippets for the reviews chosen, but how they are displayed was given as much attention in the patent filing as which reviews might be displayed.
Some excellent information in the article and follow-up comments. My brother does web design and I do SEO. We use WordPress for our websites and encourage clients, where possible, to include a blog on their website so they have the freedom to log in and update their latest news, or to send the details over to us and we do it for them.
This ensures that the website never becomes stagnant, plus users will benefit more from up-to-date news.
For directory purposes I use http://getlisted.org, where I list a client’s business on the top five directories in the UK, such as Qype, Brownbook, Google Places, and Hotfrog. This is a great way to market your business and gives you very good exposure for free. It will also give you a lot of relevant backlinks to your website, which should help you in the SERPs.
Thank you. I like the approach that you and your brother are taking.
WordPress has made it a lot easier for many small and medium sized businesses to build presences online at much more affordable rates than in the past, and it makes it easier for them to update and maintain their sites.
Getlisted is definitely a useful resource.
Yes, I agree, and the truth is that should be the case when it comes to reviews. What’s more important is that readers are able to understand what the review is really about. Forget the spelling or any other typos; as long as the review delivers what it should, that’s it.
On the lighter side of things, Man it’s kind of hard to follow the thread. I couldn’t respond to other comments without going all the way down. 🙂
Thanks – This patent filing isn’t really about discounting any of the reviews that a business or product might receive, but rather about deciding which reviews Google might display most prominently when it gathers them from a variety of sources.
As for following the comments, I’m torn over whether or not I should use nested comments. I might start doing that.
Noted. Thanks Bill.
You’re welcome, Noel.
great post Bill, from start to finish, and great comments that complete it.
I knew that reviews of products were important for rankings, but I did not know that Google was giving them increasingly more importance. I conclude that Google will evaluate the quality of a review of a product or service according to its general tone, spelling, and grammar. Although I think there will still be deliberately positive or negative comments.
Thank you. It is quite possible that Google will show both positive and negative reviews on purpose, quite likely using sentiment analysis so that they can get an idea of whether a review is positive or negative. I think a good place to see a site doing that well is Amazon, which will show both positive and negative reviews at the top of a review page, to show people the scope and variety of reviews that something has been given.
What is also interesting is how this could also give Google quality indications not just for reviews that influence Google Places rankings but how Google could judge all content on the web.
From my experience this is certainly not true yet, but certain levels of this potentially apply to gaining SERP visibility for long-tail search queries.
Although if all of these factors were applied to determining the quality of pages for Google standard search results, Google would be expecting its users to be writing more intelligent search queries :p
It’s possible that Google is doing something somewhat similar with its Panda updates – using a machine learning approach with a seed set of “high quality” pages to rate and rank other pages on the Web. I’m not sure that doing that would require searchers to write more intelligent search queries, though.
I agree with your post Bill.
I have noticed that Google gives more importance to high-quality content in which there are no mistakes in grammar or spelling, and in which high-value words are used. I think the best example is Matt Cutts’ blog.
I’m not completely convinced that we can draw a correlation between grammar, spelling, and the use of high-value words, and the rankings of pages in search results.
The post itself is about the reviews that Google might show as representative reviews when it does display those, and it makes sense that they would be concerned about those types of things if they were going to present reviews in prominent places.
Things like correct spelling, grammar, and so on might definitely be indications of quality on a page, but I’m not completely convinced that they are the most important things that a search engine should consider when ranking pages in response to a query.
I’m a big review hunter before I buy, but some reviews are clearly spam, or reputation management agencies just bulk-submitting reviews. I think it can only be a good thing if Google picks up on these and distinguishes the review repeated a million times from the genuine, in-depth, honest reviews that have taken time to create.
I like looking for reviews before I buy as well, but I’m definitely going to try to find more than one, and try to understand when I see serious differences of opinions between those reviews. Spam reviews are definitely a problem, and relying just upon one review for something is risky.
I think Google takes into consideration the source of reviews. It looks for where the reviews are coming from. I always go for reviews before buying anything. I think Google is doing great in improving its algorithm to get rid of spammers.
I agree with you, and a somewhat recent whitepaper from Google described how they were comparing different review sources to try to gauge how much importance they should give to reviews from different sources. A lot of searchers seem to want to see reviews before they buy something, so it makes sense for Google to spend some significant time and energy upon them.
See my post, Google Research Paper on Online Reviews for Merchants and Products for more details, and a link to the paper.
Will Google use this patent to also list Hotel reviews? So far what I have seen is that Google always gives importance to hotel reviews listed on Google Maps. These reviews are nothing more than 2 liners and hardly provide useful information.
The pending patent I wrote about in this post appears to be one that could be used for all kinds of reviews, from reviews of products to reviews of businesses of many different types. Its focus appears to be on finding the most representative review from amongst the ones people have submitted for the sites in question. If all or most of the reviews that have been submitted aren’t all that useful or long, Google just doesn’t have a lot to choose from.