On Tuesday, June 23, 2015, Barbara Starr and I are giving a two-person presentation to the SEO San Diego Meetup, SEM San Diego Meetup, and Lotico Semantic Web San Diego Meetup groups, titled Ranking in Google Since the Advent of the Knowledge Graph.
Barbara and I have been looking at a lot of patents while preparing for the presentation. One of the topic areas we planned to discuss was quality scores, since one of those patents, which mentions adding “Buy Now” buttons to paid search listings in search results, may only do so if the sites being considered have a high enough quality score associated with them.
While preparing, Barbara pointed out another patent to me that focuses upon low quality scores. It describes how a site might lose traffic if ranking scores for links pointed to it are below a certain threshold.
This patent describes assigning a quality score to the resources that link to a site; counting the number of resources in each quality group; determining a link quality score for the site using those counts; and, if the link quality score falls below a threshold, classifying the site as a low quality site. Being classified as a low quality site can result in the ranking score for that site's search results being decreased by an amount based on the link quality score, following a formula described in the patent.
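The process described above can be sketched in code. To be clear, the patent does not publish its group boundaries, group weights, threshold, or demotion formula, so every number below is a hypothetical placeholder chosen only to illustrate the grouping-then-scoring mechanism:

```python
# Hypothetical values for illustration only; the patent does not
# disclose its actual boundaries, weights, or threshold.
GROUP_BOUNDARIES = [0.25, 0.5, 0.75]   # splits [0, 1] into 4 quality groups
GROUP_WEIGHTS = [-1.0, 0.0, 0.5, 1.0]  # low-quality groups count against a site
LINK_QUALITY_THRESHOLD = 0.2

def assign_group(score: float) -> int:
    """Assign a linking resource to the quality group whose range contains its score."""
    for i, boundary in enumerate(GROUP_BOUNDARIES):
        if score < boundary:
            return i
    return len(GROUP_BOUNDARIES)

def link_quality_score(resource_scores: list[float]) -> float:
    """Count resources per group, then combine the counts into one site-level score.

    A weighted average over the group counts is used here as a stand-in for
    the patent's undisclosed formula.
    """
    counts = [0] * (len(GROUP_BOUNDARIES) + 1)
    for score in resource_scores:
        counts[assign_group(score)] += 1
    total = sum(counts)
    if total == 0:
        return 0.0
    return sum(w * c for w, c in zip(GROUP_WEIGHTS, counts)) / total

def is_low_quality_site(resource_scores: list[float]) -> bool:
    """Classify the site as low quality if its link quality score is below threshold."""
    return link_quality_score(resource_scores) < LINK_QUALITY_THRESHOLD
```

A site linked to mostly by high-scoring resources lands in the top groups and clears the threshold, while a site linked to mostly by low-scoring resources is pulled below it and would be classified as low quality.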
The patent points out a few things that might lead to a lower quality score based upon links pointing to your site. These can include links to your site from boilerplate content, and links to your site from redundant material on the same sites. Enough links like these might result in a “low quality site” label for your site.
It’s difficult to tell whether a patent like this one has gone into effect, or has had an impact in any way, though reading articles like the following one makes you wonder if Google has put the details described within this patent, and similar ones, into practice: Phantom 2 – Analyzing The Google Update That Started On April 29, 2015.
Linked to from Boilerplate
Many sites have content on them that could be considered boilerplate content. This could include footer content and sidebar content that doesn’t add much value to pages on a site, and is copied from one page to another. This can also include copyright notices, links to things like sitemaps, and others. It appears that when someone links to another site from boilerplate sections of pages, Google may view that as an indication of low quality for the sites being linked to.
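One common heuristic for spotting boilerplate links, sketched below purely as an assumption (the patent does not say how Google identifies boilerplate), is that a link appearing on nearly every page of a site is probably in a footer, sidebar, or other template region rather than in editorial content:

```python
from collections import Counter

def boilerplate_targets(pages_outlinks: dict[str, set[str]],
                        min_fraction: float = 0.8) -> set[str]:
    """Return link targets that appear on a large fraction of a site's pages.

    pages_outlinks maps each page URL on a site to the set of targets it
    links to. Targets present on min_fraction or more of the pages are
    treated as boilerplate (e.g. footer or sidebar links). The 0.8 cutoff
    is an arbitrary illustrative value.
    """
    n_pages = len(pages_outlinks)
    counts = Counter(target for links in pages_outlinks.values() for target in links)
    return {target for target, c in counts.items() if c / n_pages >= min_fraction}
```

A link that this heuristic flags would, under the patent's reasoning, contribute a weaker (or negative) quality signal to the site it points at than a link placed in the body of a single page.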
The patent also discusses something it calls Diversity filtering, which is a process for discarding resources that provide essentially redundant information to the link quality engine. If your site provides a resource that links to a specific site, and does so on a number of pages, Google may only count one of those links, and discard the rest of them. It may count those filtered links against the site being pointed to, again as a sign of low quality.
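Diversity filtering of this kind can be sketched as a simple deduplication pass: keep one link per linking-site/target pair and discard the rest. This is a minimal illustration of the idea, not the patent's actual implementation:

```python
def diversity_filter(links: list[tuple[str, str]]) -> tuple[list[tuple[str, str]], int]:
    """Discard essentially redundant links from the same source site.

    links is a list of (source_site, target_site) pairs, one per linking
    page. Only the first link from each source site to each target is
    kept; the rest are counted as discarded redundant links.
    """
    seen: set[tuple[str, str]] = set()
    kept: list[tuple[str, str]] = []
    discarded = 0
    for source_site, target_site in links:
        key = (source_site, target_site)
        if key in seen:
            discarded += 1
        else:
            seen.add(key)
            kept.append(key)
    return kept, discarded
```

So a site linking to you from fifty of its pages would contribute only one counted link, and, per the patent, the filtered-out duplicates might themselves be treated as a low quality signal for the site being pointed to.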
Advantages of the Patent
The advantages that the inventors of this patent tell us it brings are:
- A search system can determine a link quality score for a site using a distribution of resource quality scores of resources linking to the site.
- The search system can classify the site as a low quality site if the link quality score is below a threshold score.
- When providing search results, the search system can decrease the ranking score of a site classified as a low quality site, which can result in higher quality sites being provided to users instead of low quality sites.
The patent is:
Classifying sites as low quality sites
Publication number US9002832 B1
Publication date Apr 7, 2015
Filing date Jun 4, 2012
Priority date Jun 4, 2012
Inventors Rajan Patel, Zhihuan Qiu, Chung Tin Kwok
Original Assignee Google Inc.
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for enhancing search results. In one aspect, a method includes receiving a resource quality score for each of a plurality of resources linking to a site. Each of the resources is assigned to one of a plurality of resource quality groups, each resource quality group being associated with a range of resource quality scores, each resource being assigned to the resource quality group associated with the range encompassing the resource quality score for the resource.
The number of resources in each resource quality group is counted. A link quality score is determined for the site using the number of resources in each resource quality group. If the link quality score is below a threshold link quality score, the site is classified as a low quality site.
This isn’t the only patent from Google that focuses upon reducing rankings of sites based upon a low quality score for those sites. The Panda and Penguin updates from Google both seem to do that. I’ve also written about a few others that involve low quality scores as well, such as:
I wrote one at Moz on this topic, too:
The argument that Google seems to make about decreasing rankings for sites that seem to be low quality is that doing so raises the rankings of sites that are higher quality.