How Google May Demote Some Search Results for Subsequent Related Searches


Sometimes when you search at Google, you might not find any results that interest you, and may search again using a somewhat similar query. Chances are that you don’t want to see the same sites or pages all over again. A newly granted patent from Google describes how the search engine might demote results for pages from sites that showed up in an earlier search when they appear again in a subsequent search during the same query session.

For example, imagine that you search for [black jacket] and don’t see any results that you like on the first page, regardless of whether you clicked upon any of them. Instead of going to the second page, you search for [black coat]. Since the queries are related, you might see results from some of the same sites in both searches, which the patent refers to as “repetitive” search results. Google may take your decision to search again as an indication that you weren’t satisfied with the pages shown in the first set of results, and may demote some of the “repetitive” sites or pages from that first query so they aren’t as prominent in the second set of search results.

So, your search for [black jacket] might show a page from the site “Winter Coats Online.” You might click upon it, or you might not. Regardless, when you move on to search for [black coat], if a page (the same page or another) from “Winter Coats Online” would have ranked highly for that search, Google might push it down in search results so that it isn’t listed as prominently.

The patent is:

Demotion of repetitive search results
Invented by Ashutosh Garg and Kedar Dhamdhere
Assigned to Google Inc.
US Patent 8,051,076
Granted November 1, 2011
Filed: December 13, 2007


Apparatus, systems, and methods for demoting repetitive search results are disclosed. Search results that are identified in both a first set of search results and a second set of search results are determined to be repetitive search results.

One or more of the repetitive search results can be demoted in the second set of search results. The demotion can be based on a relevancy threshold for the second set of search results.

The claims of the patent tell us that we might see this kind of reranking behavior from the search engine if the two searches happen during a “search session,” which could be defined by a certain amount of time (such as 5 minutes, an hour, etc.), by a user logging into and out of the search engine, or by some perceived relationship between the queries that a person searches for. It also appears that this would happen when someone is logged into the search engine, though some other way of tracking searches, such as cookies, might be used.

It’s also possible that Google might demote only repetitive search results that have been previously selected by the searcher, rather than all of the repetitive results that might have been seen in the results from the earlier query.

We’re told that one of the things that the search engine will try to avoid is ranking demoted pages below search results that are likely to include “very little relevant content responsive to the user query.” For each set of search results, a “relevancy threshold” is calculated, based upon things like the difference in relevancy scores between the results that appear.

For instance, imagine that in a set of ten results, the relevancy scores for the first 4 results are pretty close, and then the results below those have much lower relevancy scores. The point between the fourth and fifth results is the relevancy threshold for that set of results because of the steep dropoff in relevance.
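The patent doesn’t spell out a formula for finding that point, but one simple way to locate such a threshold, assuming numeric relevancy scores are available (the function and scores below are invented for illustration), would be to look for the largest drop between adjacent scores:

```python
def relevancy_threshold(scores):
    """Return the index of the first result below the relevancy threshold,
    placed at the largest drop between adjacent relevancy scores."""
    drops = [scores[i] - scores[i + 1] for i in range(len(scores) - 1)]
    biggest_drop = max(range(len(drops)), key=drops.__getitem__)
    return biggest_drop + 1  # results[:biggest_drop + 1] sit above the threshold

# The first 4 scores are close together, then relevancy falls off steeply,
# so the threshold lands between the fourth and fifth results.
scores = [0.91, 0.90, 0.89, 0.88, 0.45, 0.40, 0.38]
print(relevancy_threshold(scores))  # 4
```

A production system would likely use something more robust than a single largest gap, but the idea of scoring the dropoff between neighboring results is the same.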

When a repetitive result appears in the second set of results, it will only be demoted if it is above the relevancy threshold. It is still a relevant result and will continue to rank as one, but it is demoted so that non-repetitive results can be displayed more prominently, letting the searcher see pages from sites that they might not have seen in their first search.

The demoted page won’t be demoted below pages that are under the relevancy threshold.
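Putting those two rules together, the demotion could be sketched as a reordering within the above-threshold group only. This is purely illustrative (the site names and threshold index are invented, and the patent describes the behavior rather than an implementation):

```python
def demote_repetitive(results, seen_sites, threshold):
    """Move repetitive results to the bottom of the above-threshold group.

    `results` is a ranked list of sites for the second query, `seen_sites`
    holds the sites shown for the earlier query, and `threshold` is the index
    of the first result below the relevancy threshold. Results under the
    threshold are left alone, so no demoted page ever falls below it."""
    above, below = results[:threshold], results[threshold:]
    fresh = [s for s in above if s not in seen_sites]
    repetitive = [s for s in above if s in seen_sites]
    return fresh + repetitive + below

seen = {"wintercoatsonline.example", "jackethub.example"}
second_query = ["newstore.example", "wintercoatsonline.example",
                "parkas.example", "thin-content.example"]
print(demote_repetitive(second_query, seen, threshold=3))
# ['newstore.example', 'parkas.example', 'wintercoatsonline.example', 'thin-content.example']
```

Note that the repetitive site keeps its relative order and only slips to the bottom of the relevant group, while the below-threshold result stays last.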


This demotion approach may explain why you sometimes see pages ranking at different places in search results for a particular query if you are looking at those results during different query sessions, and also looking at related queries earlier during each of those sessions.

We know that Google will sometimes provide search customizations to results based upon your search history, or upon a location that you either specified with Google previously or that was inferred from your IP address. The search history customizations might be based upon your Web history if you’re logged into your Google Account, or upon previous searches linked to the browser you are using through tracking with a cookie.

The demotions described in this patent might be considered customizations, but they are limited to an individual query session rather than something impacted by a search history that could include many different query sessions. So, if you performed a search for [black coat] a month ago, and search for [black jacket] today, you might not have any results demoted.

The patent doesn’t describe how it defines whether or not queries are related.

One way a search engine might do that is to aggregate query information from many search sessions from many different searchers to see which terms or phrases tend to show up in query sessions together. Another might be to treat queries as related if they tend to have a number of the same pages showing up in search results for them (a certain top number of results, or a certain number of results above a “relevancy threshold”, for instance).
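As a rough sketch of that second idea, two queries could be treated as related when their top results overlap enough. The `k` and `min_overlap` parameters below are invented for illustration; the patent doesn’t specify how relatedness would be measured:

```python
def queries_related(results_a, results_b, k=10, min_overlap=0.3):
    """Consider two queries related if their top-k result sets share
    enough sites, measured here with Jaccard overlap."""
    top_a, top_b = set(results_a[:k]), set(results_b[:k])
    overlap = len(top_a & top_b) / len(top_a | top_b)
    return overlap >= min_overlap

coats   = ["site-a", "site-b", "site-c", "site-d", "site-e"]
jackets = ["site-b", "site-c", "site-d", "site-f", "site-g"]
print(queries_related(coats, jackets))  # True (3 shared sites out of 7 total)
```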


22 thoughts on “How Google May Demote Some Search Results for Subsequent Related Searches”

  1. From a user standpoint, this is incredibly useful. I hate it when I re-phrase a search query and it still shows the same pages that aren’t helpful, like a Yahoo! Answers page with no useful information or an eHow article where it’s clear the author doesn’t know what they’re talking about.

    From an SEO standpoint, I don’t really see this changing things up too much. Especially if the search session is small, like 5 minutes. I’d be curious to find out how long a search session is according to Google. Anyway, thanks for the article Bill!

  2. Hi Corey,

    Thank you.

    I agree with you about the usefulness of this approach, and about it probably not making too much difference from an SEO standpoint since it would be limited to individual query sessions, and it won’t demote sites/pages below the relevancy threshold described in the patent.

    The patent defines query sessions a few different ways. One is based upon a certain predefined length of time, such as 5 or 10 minutes. Another might be the time between when someone logs onto their Google Account and then logs out. The third is a period of time when they might seem to be performing similar or related searches.

    I suspect that some kind of combination of the above might be used for a few reasons. One of them is that you probably don’t want too long of a period of time to be considered a query session, especially if a computer is shared amongst a number of people, and the informational need that sparked some queries isn’t going to be the only thing that people might search for. Another is that people often multi-task or get sidelined when they perform searches, so you may have a number of unrelated or slightly related queries mixed together.

  3. All the more reason businesses should continue to diversify property, IP & content portfolios. If applied, devising a test measuring multiple competing ad content may lead to great things.

  4. Interesting this. Maybe further sites will need developing to cover further angles so netting further terms that are searched for. Even though the products, service etc are from the same company. More gateways?

  5. Sounds like a decent feature for searchers. It stands to reason that you don’t want to see the same results a second time.

  6. Bill,

    I hate to admit it, but as I get older, when I search a lot of ways, I like to be able to go back to a prior search to find that site that had, say, the information on Yurt camping on the New River. My searching could trigger an evolution in my thinking once I see what is available. In that case, would the exact same search ten minutes later bring up different results? And if I can’t remember the exact terms I used, I might never find the Yurt place again!

    OK, so I know it’s not likely, but I wonder if they try to anticipate too much, might they outsmart themselves (or me)?

  7. When it comes to Google you can never be sure… I will wait to see if SEO will change after that new development. Nice article… thanks for sharing

  8. If your site is optimized correctly and you have compelling title tags and description tags then this can only benefit the user and the website owner. Google is trying to find that perfect match.
    Just means we will have to create more pages and be laser-focused on each page. It’ll keep my copywriters happy! 🙂

  9. Bill, I think this patent has a lot to do with Google Instant, which has changed the way people search. Earlier, there used to be shorter sessions, but with Google Instant, users might reformulate a query much more often – encouraged by the continuous display of search results in a fraction of a second.

  10. Pingback: Google Instant and a Google Patent :: Prodigal Webmaster
  11. Hi Michael,

    It’s often a good idea to diversify when it comes to your marketing efforts, to not rely too much upon any one source for traffic, and to try to make sure that you provide the best representation of your site in titles and snippets that might be shown in search results as possible.

    If you’re going to use paid search, testing the effectiveness of ad copy is essential as well.

  12. Hi Matt,

    From a searcher’s perspective, this patent makes a lot of sense. From the perspective of a site owner, I think it shows the importance of thinking carefully about the terms and phrases that you might want to optimize the pages of a site for, and to definitely not focus too narrowly or rely too much upon a small pool of main head terms, while also recognizing the importance of traffic from tail terms as well.

    There may be some value in having more than one domain where appropriate, and towards developing a strong presence in places like Facebook and other sites as ways to increase the visibility of your site and business and the products that you offer on the Web.

  13. Hi Alan,

    Good points, and I like to explore search results and not limit myself to any one when I first start searching.

    I think that’s true of many people who are looking for information, and want to find out more about a particular topic or type of product or place. For some searches, we do have a particular site or product in mind, but for many we are going to browse, learn more about that topic, compare pages and products, and so on.

    I know that researchers from Microsoft have written a good number of papers on refinding information that we’ve come across before, and have developed strategies for helping us refind information. One of those might be to show the paths that we have taken in our searching and browsing histories on previous searches. Google’s Web History is supposed to help make refinding information easier, but it’s something that they don’t emphasize much, and you might not even think about it when you’re trying to find a page that you went to before.

    If you perform a search to find a particular page that you’ve seen before in search results within a short period of time, like ten minutes, that page might potentially be demoted and ranked lower than it was if you’ve performed a number of searches for related queries within that query session. But, one of the things that this patent stresses is that they will try not to demote specific relevant results too much when they do demote them.

    For each query, they will decide upon some point where there is a significant dropoff in the relevance of the results shown, which is the relevancy threshold. So imagine that the first 6 results for a query are very relevant, with a dropoff at the seventh, and you then perform a second, very related query whose relevancy threshold falls between the 4th and 5th results. Any sites that appear both among the 6 very relevant results from the first query and the 4 very relevant results from the second query might be demoted, but they wouldn’t be demoted below the relevancy threshold in the second set of results.

    So here’s an example of how this might work for [black coats] and [black jackets], which could be seen as very related queries:

    Search results for query 1 – black coats

    1. site a – very relevant
    2. site b – very relevant
    3. site c – very relevant
    4. site d – very relevant
    5. site e – very relevant
    6. site f – very relevant
    ***** Relevancy threshold
    7. site g – not too relevant

    Search results for query 2 (before demotion and what you wouldn’t see) – Black Jackets

    1. site h – very relevant
    2. site b – very relevant
    3. site i – very relevant
    4. site j – very relevant
    ***** Relevancy threshold
    5. site k – not too relevant

    Search results for query 2 (after demotion and what you would see) – Black Jackets

    1. site h – very relevant
    2. site i – very relevant
    3. site j – very relevant
    4. site b – very relevant
    ***** Relevancy threshold
    5. site k – not too relevant

    Note how site b was demoted, but it wasn’t demoted below the relevancy threshold.

    The purpose behind this approach is to try to make your subsequent searches more productive by offering you more variety in those following searches, but not to hide results from you completely.

    Remember, too, that this kind of demotion will only happen during a query session. So if you do the second search for black jackets after a few hours, or the next day, or the next week, the results that you see wouldn’t have demoted results, and you would possibly see this instead:

    Search results for query 2 (without demotion and what you would see) – Black Jackets

    1. site h – very relevant
    2. site b – very relevant
    3. site i – very relevant
    4. site j – very relevant
    ***** Relevancy threshold
    5. site k – not too relevant
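    To make that reranking concrete, here’s how it could be sketched in code (purely illustrative; the patent describes the behavior rather than an implementation):

```python
query1_above = ["a", "b", "c", "d", "e", "f"]  # query 1 results above its threshold
query2 = ["h", "b", "i", "j", "k"]             # query 2; threshold falls after result 4
threshold = 4

# Repetitive sites sink to the bottom of the above-threshold group,
# while anything under the threshold is left untouched.
above, below = query2[:threshold], query2[threshold:]
seen = set(query1_above)
reranked = ([s for s in above if s not in seen]
            + [s for s in above if s in seen]
            + below)
print(reranked)  # ['h', 'i', 'j', 'b', 'k']
```

    With demotion applied, site b slips from second to fourth but stays above the threshold; outside of the query session, `seen` would be empty and the ordering would be unchanged.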

  14. Hi Darren,

    Of course, just to play devil’s advocate a little, if you keep on seeing the same site ranking well for very related queries, then you might decide on the second search or a later one in the same query session that it might be the best site to visit. 🙂

  15. Hi Simon,

    It’s hard to tell just by looking at search results that Google might be doing this kind of demotion and reranking. Without having seen this patent, I’m not sure that I would have guessed that they were doing something like this.

    How do you respond to it as an SEO once you know about the possibility? That’s definitely a question worth exploring.

  16. Hi Ian,

    One of the things that I like about blogging and blog posts is that you can explore related topics and issues from different perspectives and vantage points that could help you rank well for multiple queries. I do grow a little concerned though when you might try to create multiple pages that contain primarily the same content but are focused upon being optimized for different keywords that are very related. I’m not sure that’s the right way to approach something like this.

  17. Hi Prodigal Webmaster,

    That’s a very interesting point. The query refinements that Google might include on a page as suggestions for followup or related searches might be less likely to be clicked upon than ones that are suggested to you within the dropdown that Google might show. And if people decide to follow those suggestions, chances are that they want to find some value in them. If Google were to keep on showing the same sites for them, people might be less inclined to look at those suggested revisions.

  18. Right. If this is to come to fruition, what is the response? And how will we be able to track it?

  19. Hi Mike,

    I’m not sure if there is any viable way to track when this is happening.

    If your pages are relevant for a particular query, and relevant for a subsequent related query in the same query session, your pages shouldn’t fall below a certain relevancy threshold if they are demoted, but there really isn’t a way of telling how high or low that threshold might be for different queries.

  20. We have to aim to be as relevant as site b in your example; it still shows up (albeit at a lower rank) after demotion.

  21. Hi Eliseo,

    It doesn’t hurt to try to be as relevant as you can for related queries so that you can show up in results for both, but as you note, if pages from your site do show up for related queries for a searcher in the same query session, they may be demoted somewhat in search results.

    It does make sense to do some research though to try to find certain related queries and optimize for them when they seem appropriate.
