Google was granted a patent this week that describes how web sites might be given quality ratings, based upon a model trained on human ratings for a sample set of sites and the web site signals collected from those sites.
The patent tells us that the advantages of such an approach would be to:
Provide greater user satisfaction with search engines
Return sites having a quality rating higher than a certain threshold
Rank sites appearing in search results based upon quality
Identify quality sites without having a human view the site first
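The core idea in the patent can be pictured as a supervised model: take a sample of sites that human evaluators have rated, learn how site signals relate to those ratings, and then score sites no human has seen. Here is a minimal sketch of that idea as a simple linear regression. To be clear, this is my illustration, not Google's actual system; the signal names, training data, and threshold below are all invented.

```python
# A minimal sketch (NOT Google's actual system): learn quality scores from a
# human-rated sample of sites, then predict scores for sites no human has rated.
# Signal values and ratings here are invented for illustration.

def train(samples, ratings, lr=0.01, epochs=5000):
    """Fit per-signal weights by stochastic gradient descent on squared error."""
    n_signals = len(samples[0])
    weights = [0.0] * n_signals
    bias = 0.0
    for _ in range(epochs):
        for signals, rating in zip(samples, ratings):
            pred = bias + sum(w * s for w, s in zip(weights, signals))
            err = pred - rating
            bias -= lr * err
            weights = [w - lr * err * s for w, s in zip(weights, signals)]
    return weights, bias

def predict(weights, bias, signals):
    """Score an unrated site from its signals using the learned weights."""
    return bias + sum(w * s for w, s in zip(weights, signals))

# Hypothetical signals per site: (share of original content, inbound-link score)
rated_sites = [(0.9, 0.8), (0.2, 0.1), (0.7, 0.6), (0.1, 0.3)]
human_ratings = [0.95, 0.15, 0.70, 0.20]   # scores from human evaluators

weights, bias = train(rated_sites, human_ratings)
score = predict(weights, bias, (0.8, 0.7))  # signals for an unrated site
QUALITY_THRESHOLD = 0.5                     # invented cutoff for illustration
print(round(score, 2), score > QUALITY_THRESHOLD)
```

The threshold check at the end mirrors the patent's second listed advantage: once every site has a predicted score, returning only sites above some quality cutoff becomes a simple filter.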
This patent was originally filed in 2008, and the use of quality signals sounds similar to what Google has shared with us regarding the Panda Update. It’s more of a search quality “improvement” than a web spam penalty.
The patent uses blogs, within its claims and description sections, as an example of the type of site the approach might be applied to. One of the inventors, Christopher C. Pennock, was a Senior Software Engineer on Google Blog Search, according to an early 2009 SMX session with him which discusses ranking signals in Blog Search.
On May 1st, Google’s Head of Webspam Matt Cutts published a video in his series of Google Webmaster Help videos, answering the question, “What’s the latest SEO misconception that you would like to put to rest?”
For some reason, Matt decided to focus upon patents, with a video about people possibly placing too much faith in what is uncovered in patents related to search engines. To a degree, I agree with his response, but a number of people reached out to me because they saw the video as something aimed specifically at me, since I write about search-related patents so often. I felt that I had no choice but to respond. Here’s the video from Matt:
In an ideal world, your site architecture should be set up so that search engine crawlers can visit each page of your site at exactly one web address, and no more. You may be laughing, but when Google sends you the “I give up, your site has too many URLs” message in Google Webmaster Tools, you won’t be laughing then. Seriously.
Keep Colors and Sizes Together
If you create multiple product pages where the only difference is offering the product in red, green, or blue, or in small, medium, or large, you are creating too many pages. The same is true when you decide to let “email a friend” pages get indexed, along with “Add to my wishlist,” “Compare Products,” and other pages that Google doesn’t want in its index either.
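One practical way to think about the color-and-size problem is that every variant URL should collapse to a single canonical product address. Here is a small sketch of that kind of normalization; the parameter names (`color`, `size`) and the example domain are invented for illustration, and a real site would tailor the list to its own query parameters.

```python
# A minimal sketch: strip product-variant query parameters (names invented
# here) so every color/size combination maps to one canonical product URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

VARIANT_PARAMS = {"color", "size"}  # hypothetical variant parameters

def canonical_url(url):
    """Return the URL with variant parameters removed and no fragment."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in VARIANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/shirt?id=42&color=red&size=m"))
# → https://example.com/shirt?id=42
```

On a live site you would point a rel="canonical" link element at the address this kind of function produces, so crawlers consolidate all the variants onto one URL.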
On September 8, 2011, Google filed a provisional patent application named “System and Method for Confirming Authorship of Documents” (U.S. Provisional Application Ser. No. 61/532,511). This provisional application expired on September 9, 2012, without being converted. A day later, on September 10th, Google filed two new versions of the patent, using the same name for both of them. Othar Hansson’s name appears on both as lead inventor, and the description sections are substantially similar, with a couple of very small changes.
The claims sections of the two patents are different, however. The first patent application (US20130066970) describes a link based approach to claiming authorship of a site, or being a contributor to that site. The second patent application (US20130066971) describes an email based method of claiming authorship (or of being a contributor).
The approaches described in both patent filings appear to be substantially similar to the instructions that Google describes in its help pages, starting at Author information in search results.
In 2006, Google battled Yahoo! and Microsoft for an algorithm developed by an Israeli Ph.D. student in Australia. The algorithm had a semantic element to it, and it advanced Google in an algorithm arms race between the search giants (one of which doesn’t even have a search engine of its own now). We’ve seen the technology described in terms of how it is displayed in search results, but not how it does what it does. Until now.
Google was awarded a patent this week that looks at search results for specific queries, and the entities that appear within them, to produce query refinements. The invention is from Google, but the lead inventor behind it was at the center of a bidding war between Google, Yahoo!, and Microsoft. In 2009, the breakthrough was made public on Google in the form of the Orion technology.
The Orion approach involved both extended snippets for queries (three or more lines of descriptive snippet instead of two, for some longer queries) and “more and better query refinements.” How this technology is displayed is described in a Google Official Blog post from March 24, 2009, titled Two new improvements to Google results pages.
It doesn’t do any good to rank well in search results if no one clicks through.
If you go to Google Webmaster Tools and see the list of queries a page of yours might rank well for, you might see some query terms or phrases that you want to show up in search results for. Webmaster Tools will show you how many “search impressions” your page has received, as well as how many people have clicked on it when they have seen it. So what if your page has received 10,000 search impressions for a term or phrase, but only 50 clicks?
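To put those numbers in perspective, 50 clicks out of 10,000 impressions is a click-through rate of half of one percent:

```python
# Click-through rate for the example above: 50 clicks on 10,000 impressions.
impressions = 10_000
clicks = 50
ctr = clicks / impressions
print(f"{ctr:.1%}")  # → 0.5%
```

In other words, 199 out of every 200 searchers who saw the page in their results chose something else.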
One question you should probably ask yourself is whether the term or phrase is one that your page actually satisfies.
Sometimes a query term has more than one meaning, and most people searching for it might be looking for a different one. For example, you might create a page about Java the drink, while most searchers are looking for the programming language.
When you perform a search at Google, and you have a set of search results in front of you, how do you decide what to click upon? How do you judge the page titles, the snippets, and the URLs that you see? How does Google decide what to show you? A little more than a year ago, Google Webmaster Trends Analyst Pierre Far published a post on the Google Webmaster Central Blog titled Better page titles in search results. There he told us that Google might sometimes rewrite the titles for web pages when showing them in search results. The post told us that Google might change titles when pages had generic titles such as “home”, or no title at all, or:
We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages.
Before we consider how Google might decide when and how to change page titles (in a follow up post to this one), there’s another question about search results that needs some exploration.