Google Patents Click-Through Feedback on Search Results to Improve Rankings

I was excited to see a Google patent granted this past Thursday, which describes how Google may rank pages in part based upon user feedback (clicks) in response to rankings for those pages. The patent tells us that identifying a user’s needs, and determining which of the documents returned might be most useful to a searcher, can involve “a fair amount of mind-reading—inferring from various clues what the user wants.” But we’ve been told recently by a Google spokesperson that such clues can be misleading. I thought the patent was still worth pointing out.

Google may capture search click results.
Just one click away…, Matthias Ripp, Some rights reserved

Some clues may be user specific, the patent authors tell us. When a searcher searches from a mobile device, and Google knows the location of that device, the results returned “can result in much better search results for such a user.” That does make sense.

Another clue Google may consider, they share with us, is that some pages may be linked to by a number of pages that are results for a query, and those “linked-to” pages are often good responses for that query as well. The patent tells us, “if authors of web pages felt that another web site was relevant enough to be linked to, then web searchers would also find the site to be particularly relevant. In short, the web authors “vote up” the relevance of the sites.”

The focus of this patent isn’t on those linked-to pages, or on searches made from mobile devices. Instead, it tells us that Google may monitor responses to particular search results, to see which of them are clicked upon, and that the results users often click upon will receive the highest rankings. It tells us that, “The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.”

This seems to go against a statement made a month ago by Google’s Gary Illyes, as reported in the blog post How Google Uses Clicks in Search Results, According to Google. He characterized the use of click-throughs like that as too noisy a signal to rely upon for ranking purposes:

He does say they see those who are trying to induce noise into the clicks and for this reason they know using those types of clicks for ranking would not be good. In other words, CTR would be too easily manipulated for it to be used for ranking purposes.

This patent does look more closely at such click-based feedback monitoring, to understand when such input might be used to boost the ranking scores of particular search results, in the context of a search for a particular query. Is it used by Google? We don’t know. From Gary Illyes’s statements, it doesn’t seem likely, even with the recent granting of this patent.

Implicit User Feedback Model

The patent tells us that it might adjust how much relevance such a click carries by comparing changes over time in user selections of the document. It does talk about a “background population click trend model” that clicks might be compared to, but the statement from Gary Illyes tells us that clicks for certain queries may be made in a manipulative manner, as noted in the blog post carrying his response to this question:

It is well known that you can buy people and bots who will specifically click on links in the search results in a way that makes it appear your site is the better one and your competitor’s site is not. It is pretty easy to find those that do this kind of thing.

Weighting of User Selections after a Threshold Reached or Passage of Time

It appears that the patent anticipated some noise in clicks, but not necessarily at the level that Gary Illyes tells us about. The process in the patent wouldn’t start counting user selections until after a certain threshold is reached, such as a hundred selections, and might weigh those selections differently based upon when they happen, such as within a certain number of weeks of the results being chosen by searchers.
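As a rough illustration, the threshold and time-based weighting described above might look something like this sketch. The hundred-selection threshold comes from the patent’s language; the window length and the two weights here are my own assumptions, not values from the patent:

```python
from datetime import datetime, timedelta, timezone

# The hundred-selection threshold is mentioned in the patent; the window
# length and the two weights below are illustrative assumptions only.
CLICK_THRESHOLD = 100
RECENT_WINDOW = timedelta(weeks=4)
RECENT_WEIGHT = 1.0
OLDER_WEIGHT = 0.5

def weighted_click_score(click_times, now=None):
    """Return a time-weighted click score, or None until enough
    selections have accumulated to count as a signal."""
    now = now or datetime.now(timezone.utc)
    if len(click_times) < CLICK_THRESHOLD:
        return None  # below the threshold: ignore the clicks entirely
    # Recent selections count more than older ones
    return sum(
        RECENT_WEIGHT if now - t <= RECENT_WINDOW else OLDER_WEIGHT
        for t in click_times
    )
```

In this toy version, fifty clicks produce no signal at all, while a hundred recent clicks plus fifty older ones produce a score that discounts the older half.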

The patent is:

Modifying search result ranking based on a temporal element of user feedback
Publication Number: 09092510
Publication Date: 28.07.2015
Grant Date: 28.07.2015
Inventors: Robert J. Stets, Jr., Mark Andrew Paskin


In general, the subject matter described in this specification can be embodied in a method that includes: obtaining user feedback associated with quality of an electronic document; adjusting a measure of relevance for the electronic document based on a temporal element of the user feedback; and outputting the measure of relevance to a ranking engine for ranking of search results, including the electronic document, for a search for which the electronic document is returned.

Obtaining the user feedback can include receiving user selections of documents presented by a document search service; the method can include evaluating the user selections in accordance with an implicit user feedback model to determine the measure of relevance; and adjusting the measure of relevance can include adjusting the measure of relevance in accordance with the implicit user feedback model.
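Read as pseudocode, the claim describes a small pipeline: collect user selections, decay them by age, fold the result into a measure of relevance, and hand that measure to a ranking engine. Here is a minimal sketch of that flow; the half-life decay and the blend constant are my assumptions for illustration, not anything specified in the patent:

```python
def adjust_measure_of_relevance(base_relevance, selection_ages_days,
                                half_life_days=30.0, blend=0.1):
    """Adjust a document's relevance using time-decayed user selections.

    Each selection contributes 0.5 ** (age / half_life), so a fresh click
    counts fully and an old one fades -- the "temporal element" of the
    claim. half_life_days and blend are made-up illustrative constants.
    """
    decayed_clicks = sum(0.5 ** (age / half_life_days)
                         for age in selection_ages_days)
    return base_relevance + blend * decayed_clicks

def rank_results(results):
    """Stand-in for the ranking engine: order by adjusted relevance."""
    return sorted(results, key=lambda r: r["relevance"], reverse=True)
```

For example, a click made today and a click made thirty days ago would together contribute 1.5 decayed clicks rather than 2, nudging the document’s score upward while discounting the older selection.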

Takeaways

The patent does list a number of advantages of an approach like the one it describes, but the statements from Gary Illyes seem to make it unlikely to be used. This is the first patent I’ve seen from Google that seems so unlikely to be used, but I am often asked how easy or difficult it is to tell whether or not a patent I’ve written about has been implemented. So, I wrote about this one to show that sometimes patents describe processes that may never be put into use.

This patent may have been granted this week, but the statement from Gary Illyes, made in June, makes it unlikely that Google will use the process it describes, and unlikely that Google is looking at click-throughs for ranking. I do usually search to see if there has been any news about processes described in patents before I write about them.

There was at least one positive takeaway from Gary Illyes’s statements. Google may use clicks to explore and disambiguate queries for personalized results. So, when a query could cover more than one type of category, clicks might help Google decide which category to use to influence future searches for individuals, to give them personalized search results.

Rand sent me a tweet telling me that his experiments show a different result than what Gary Illyes said about the power of click-throughs:

Rand's tweet in response to my post, about his experiment.

And Rand remarked upon what I said about the Threshold described in the patent as well:

Threshold on click rates tweet.

So, maybe the patent is describing something being used at Google now? It’s possible.

37 thoughts on “Google Patents Click-Through Feedback on Search Results to Improve Rankings”

  1. Thanks, I think ranking based upon user experience is a very good idea, but Google must also fight spammers who will try to abuse this feature.

  2. What a very interesting Patent. I’m also very interested in Google’s Patent and I like reading and writing about them too.

    So from my own understanding of Google and my experience in SEO, I do have a tendency of not believing 100% of what Google spokespersons say, like Matt Cutts or even Gary Illyes, when they ”declare” that Google is not taking this element or the other into their algorithm and so on…because my experiments (done on a small scale) and those done by our top SEO experts in our industry (on a bigger scale) did indeed prove them wrong on a lot of ranking factors that they said Google didn’t look at… but in testing, as Rand proved in his tweet, these factors were actually influencing the SERP results.

    Anyways, thanks Bill for sharing your research and thoughts. I think I’m just tired of Google sending us mixed signals. Seems like they try, most of the time, to confuse everyone (experts included) in order to make SEO look like an ”untameable beast”🐯.

    Wishing you the best!
    Amel 👓 ☺ |

    P.S. I did write about Google Social Signals Patent here. I’d love to get your expert opinion on it (one sentence will suffice!). Thanks in advance.

  3. Very thorough and useful reading. I wonder why Google applies for patents that it will never implement?

    One question, if Google is using this patent in its search, will it be implemented on all search categories in general or will it be used for specific searches?

  4. Hi Usman,

    Thanks. We don’t know for certain whether or not Google will implement the technology described in this patent, or if the public statement made by Google was intended to mislead people into believing that Google wasn’t using feedback like that described in this patent. There’s nothing in the patent that says it might be limited to certain search categories, but sometimes Google might add things to how they implement what may be described in a patent.

  5. Ok but specifically speaking of the download pages which offer download stuff, people usually click the download link and then the back button to go back to Google. Will this be seen as a negative point from Google? How can this situation be avoided?

    I just started reading your blog recently and have not dug deeper into the archives. If you have written anything on this topic previously, you could point me in the right direction.

  6. Hi Usman,

    This patent is about what people click on in response to a search for a particular query. It doesn’t mention quick visits or bounces, nor does it provide any information on how Google might treat user behaviors like that.

  7. This just seems like more evidence that Google is likely using some sort of click data as part of a re-ranking engine.

    This particular patent seems to be aimed at ensuring that the re-ranking engine doesn’t over-weight historical click data, thereby allowing more recent click data to have a more profound impact on the re-ranking engine.

    This would make it more responsive and protect against a calcification based on old click data.

    Looking at the inventors, they’re both long-time (and current) Googlers, with one having a very interesting description on his LinkedIn profile.

    I enjoy designing and building systems that learn interesting and useful things from data.

    I’ll add this (the patent and this post) to my own post on the topic. Thanks, as always, for highlighting interesting patents to the community.

  8. Hi AJ,

    Appreciate your thoughts on this post; thank you. The way the patent considers the passage of time associated with clicks is an intelligent approach to capturing feedback, and, as you state, it avoids applying heavy weight to old click data.

    I’ll keep on trying to find interesting ones. I’m thankful when I see people putting together some intelligent thoughts on information about these patents, like you often do. Thank you for that.

  9. Hi Amel,

    Thank you for sharing your thoughts. I do ask myself sometimes how Google may try to adjust from chatter and discussion within the SEO industry.

    I did see the Social signals patent, and thought it was worth spending some time upon. I’m glad to see someone wrote about it. Thank you. 🙂

  10. Hi Amin,

    That is a challenge; there do seem to be a lot of people who try to manipulate rankings of pages in search results. The assumptions that Google makes are subject to being abused by a number of people. 🙁

  11. I always need a big cup of coffee, a cigar and time to think about it, every time I read a post here. And that’s a reason for coming back, time and time again.

  12. Greetings Bill,
    Great write ups there.

    If Google’s patented click-through approach is used to improve rankings, what measures have they put in place to curb “click bombs” by people trying to falsify their rankings?
    Not disputing the fact that click-through feedback is a good idea, but
    first things first.

    Thank you.

  13. Hi Bennet,

    Google could track and identify the people who made those clicks, based upon information associated with each click, such as the IP address, possibly a Google identification number, and user-agent information about the browser used by that person. Google could look for suspicious activity from profiles related to those, to try to identify manipulative behavior on the part of people clicking upon search results.

  14. Exciting patent to find, and very interesting takeaways.

    The idea of thresholds for ranking factors is fascinating to me – the idea that certain things may not apply until after X number of users have clicked per day, or whatever. I have a background in quantum physics and it brings back vivid memories of, for instance, Einstein’s gravitation not REPLACING Newton’s theories, simply augmenting them and expanding them to apply to a wider range of situations. I wonder if the algorithms Google uses could be studied a little better using this as a metaphor.

    Also, didn’t Yandex utilize user metrics like this on a much bigger scale earlier, only to have to fight off all these spam techniques already?

    Great post!

  15. Google have fairly recently said “no”,
    and they said “no” several years ago in a hangout as well (I think I remember that?).

    But is there not another patent that looked at CTRs and Titles, and referenced changing positions based on CTR volumes that outperformed the “standard” for that position?
    (There may have been a secondary patent that also looked at CTRs too).

    Yes, it is a noisy signal – but many of them are far from clean.
    It makes sense, it’s logical, yet I can see a large amount of bias occurring. Brand (be it corporate or personal) would carry a fair advantage. Personalised SERPs based on history could also be a factor. Then there is the manipulation angle. G could technically clean things up; rely on increasing thresholds, previous performance modifiers, previous search history to identify users who should be able to spot relevance/quality, utilise performance over time or dampen for initial bursts – it’s doable.

    But if they’ve said “no” ……..

  16. Hi Ethan,

    Thank you. The idea of thresholds is one to keep in mind for many signals, and when SEOs discuss different ranking signals, they rarely mention the possibility, which is probably a mistake. I’m not sure about the approach followed by Yandex; I don’t usually follow along with what takes place over at Yandex.

  17. Hi Anna,

    Google may have said that they don’t look at user clicks in response to specific queries, but in this patent they tell us that it’s a good assumption to make that there’s some value in those clicks. Do we trust that “no”? I’m not so sure.

  18. Thank you for the reply Bill.

    I don’t think I’ve ever caught someone at Google lying.
    But there are a lot of ways to evade, slip, misrepresent and omit.

    Maybe it’s how the question was phrased, or the answer given?
    Sure, G don’t use the Clicks in the SERPs – that leaves them the option of using the lack of clicks to derank a listing, yes? If Position 5 gets 7% CTR on average, but that specific URL for that specific search only seems to get a 1%, they could demote it. And though there is Click data there – it’s the lack of clicks that does the work (so depending on how things were worded – would that avoid lying?).

    Or there is the PogoSticking. Again, this isn’t the actual clicking that is used. It’s the searchers return to the SERP that is used as a signal?

    Talking to someone at Google must be like playing that kids game with all the little face cards and you have to guess who? You kind of have to whittle away to get to the actual truth, as any answer they give can still leave a lot of possibilities 😀

  19. Google is likely using search and click data for ranking, for testing SERPs, or for news / “buzz”.
    It may use a threshold, location, and language.
    I observed this phenomenon for customers who run TV advertising (3 times: before, during, and after).

  20. Google has had a lot of luck these last years. They haven’t improved their technologies and, nevertheless, they have been dominating the market. If they don’t improve their service, a competitor will.

  21. The most important factor in Google is content. Write content that readers feel compelled to share everywhere. Write shareable content; if people love your content, then and only then will Google love your content and definitely give you first priority. Also follow ethical ways of promotion, and if you need to, restructure your website according to the algorithms. And here, Google click-through feedback is a single factor.

  22. I agree with Nathan. Great quality content is the best way to improve PR. That and great quality backlinks. I have looked at my competitors’ sites, and the ones who reach page 1 are the ones with the best content.

  23. Hi Nathan,

    Thank you. Aiming at creating content that people want to share with others is a tremendous goal, and is usually worth the effort.

  24. Hi Yolanda,

    It does seem like that is very true, doesn’t it? It is worth trying to create and share things that others will appreciate.

  25. First thing that comes to my mind: how can we game this??? I don’t know if they use it as a specific signal, but overall it’s always good to think about how a positive user experience can help contribute to better rankings…

  26. Hi AdInfusion,

    I imagine that when search engineers come up with new ranking signals, one of the primary thoughts on their minds is how people might try to game it. These types of clicks seem to indicate a positive user experience, or at least some expectations of those based upon a favorable impression of a page title, a snippet, and a URL that show in search results for a site.

  27. Many of the Google algorithms favour those sites that already have good SERP results. There doesn’t seem to be any recognition that websites have a life cycle – or am I missing something?

  28. Hi Tim,

    This particular patent focuses upon sites that are already doing well in search results. It doesn’t recognize a life cycle for web sites. You’re not missing anything about this one. It has a limited focus, which is true of a lot of patents. There are other algorithms that focus upon other things that may affect newer sites, older sites, and sites without good SERPs.

  29. I can see where click through would be important for ranking, similar to bounce rate. I wonder how it all fits together. If there is a lot of click through but a high bounce rate that would probably hurt ranking…but if there was high click through and low bounce rate that would signal a good user response. Interesting that there is a patent for the technology…I thought it was part of the algorithm.

  30. Hi Cathie,

    A combination of a high click through and a low bounce rate is probably beneficial to a site owner, and is likely what Google would want to see as well.

  31. Should we use schema markup on a website, which can help to boost CTR and gradually improve ranking?

    Thank You

  32. When it pertains to on-page optimization and increasing your CTR, you need to position the essential points above the fold. Yet being above the fold is not just for on-page optimization. You also should be above the fold when it comes to search engine results. This article is really appreciated.

  33. I am always searching online for articles that can help me. There is obviously a lot to know about this. I think you made some good points in the features also.

  34. Thanks for sharing this information. Google’s patented click-through feedback is a really good idea for ranking, but you must know that quality content is the most powerful factor for getting better rankings.

  35. Interesting, whatever their algorithm is, let’s hope it’s not weighted too much on the overall score. Kind of reminds me of a popularity contest in grade school.

    Content should remain king.

  36. Hi Neng,

    When people select a search result, they are choosing based upon the title of a page, a snippet from the page (or a meta description), and the URL of the page. That information is how a search engine chooses to represent the content of web pages that it selects as search results, and it highlights (or bolds) the query terms as they appear in those. The idea behind those is to give a searcher a sense of how their query terms are used on pages being returned for their search, so it is based upon some amount of relevance, a little more so than a popularity contest. A ranking based solely upon the search results shown wouldn’t be the same as one based upon the full content and links of specific pages. The actual rankings of full pages are determined through a mixture of content and links, so that may be more of a popularity contest than one based upon clicks from a search result page. 🙂
