20 Ways Search Engines May Rerank Search Results

This is the first part of what is now a three-part series on how search engines may rerank search results, with the second part available at 20 More Ways that Search Engines May Rerank Search Results, and a third part at Another 10 Ways Search Engines May Rerank Search Results. It may be time for a fourth part soon. (Added 2013-08-31)

Search engines try to match words used in queries with words found on pages or in links pointing to those pages when providing search results.

Often, the order that pages are returned to a searcher is based upon indexing of text on those pages, text in links pointing to those pages, and some measure of importance based upon link popularity.

Before pages are served to a viewer, however, a search engine may rerank search results for one reason or another. Here are some possibilities:

1. Rerank search results by filtering of duplicate, or near duplicate, content

Search engines don’t want the same page or content to fill search results, and substantially similar pages may be filtered out. While not technically a reranking of search results, as Dr. Garcia notes in Search Engine Patents On Duplicated Content and Re-Ranking Methods, this type of filtering has the result of changing the order in which results are returned to a searcher.
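No search engine publishes the exact filter it uses, but published research on near-duplicate detection (such as Andrei Broder’s shingling work) suggests the general shape. Here is a minimal sketch, assuming a hypothetical list of ranked result dicts with a `text` field; the 0.9 threshold is likewise an assumption for illustration:

```python
def shingles(text, k=4):
    """Break a document's text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Resemblance between two shingle sets, from 0.0 to 1.0."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def filter_near_duplicates(results, threshold=0.9):
    """Keep only the first (highest-ranked) copy in any near-duplicate cluster."""
    kept, kept_shingles = [], []
    for doc in results:  # results arrive in ranked order
        s = shingles(doc["text"])
        if all(jaccard(s, prior) < threshold for prior in kept_shingles):
            kept.append(doc)
            kept_shingles.append(s)
    return kept
```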

2. Rerank search results by removing multiple relevant pages from the same site

It isn’t uncommon for more than one page from a site to be relevant to a search query. Search engines try to limit the number of pages displayed in search results from the same site. If there is more than one page from a site that ranks for a search, a search engine may show a second result from that site after the first result, indenting the second page, and inserting a link to “more results from this site.” Additional results may not be shown.
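A rough sketch of that “host crowding” behavior, assuming hypothetical ranked result dicts with a `url` field (this illustrates the visible behavior, not any engine’s actual code):

```python
from collections import defaultdict
from urllib.parse import urlparse

def crowd_by_host(results, per_host=2):
    """Limit each host to per_host results, indenting any result after the first."""
    count = defaultdict(int)
    shown = []
    for doc in results:  # results arrive in ranked order
        host = urlparse(doc["url"]).netloc
        count[host] += 1
        if count[host] == 1:
            shown.append(doc)
        elif count[host] <= per_host:
            shown.append({**doc,
                          "indented": True,
                          "more_link": f"More results from {host}"})
        # anything beyond per_host from the same host is simply not displayed
    return shown
```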

3. Rerank search results based upon personal interests

A search engine may try to rerank results for a specific searcher based upon that person’s past searches and other tracked activity on the web. This kind of reranking may rely upon the person logging in to a personalized search service. Here are a few different looks at how that might be accomplished:
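Each of those filings takes its own approach; as a toy illustration only (not any engine’s actual method), a set of terms drawn from a searcher’s past queries might be used to nudge the scores of overlapping results. The `results` and `profile_terms` shapes here are assumptions:

```python
def personalize(results, profile_terms, weight=0.2):
    """Boost results whose text overlaps a searcher's interest profile.

    profile_terms: a set of words drawn from the searcher's past queries.
    """
    reranked = []
    for doc in results:
        words = set(doc["text"].lower().split())
        overlap = len(words & profile_terms) / (len(profile_terms) or 1)
        reranked.append({**doc, "score": doc["score"] * (1 + weight * overlap)})
    return sorted(reranked, key=lambda d: d["score"], reverse=True)
```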

4. Rerank search results based upon local inter-connectivity

The search engine may grab results, and then reorder the top N (e.g., 100 or 1,000) search results based upon how those pages link among themselves.
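A minimal sketch of that idea, under the assumptions that each result carries a `url` and a baseline `score`, and that `outlinks` maps a URL to the set of URLs it links to; pages that receive links from other pages in the candidate pool get their scores boosted:

```python
def rerank_by_interconnectivity(results, outlinks, top_n=100, weight=0.5):
    """Boost top-N pages that receive links from other top-N pages."""
    pool = {doc["url"] for doc in results[:top_n]}
    reranked = []
    for doc in results[:top_n]:
        inbound = sum(
            1 for other in pool
            if other != doc["url"] and doc["url"] in outlinks.get(other, set())
        )
        reranked.append({**doc, "score": doc["score"] * (1 + weight * inbound)})
    reranked.sort(key=lambda d: d["score"], reverse=True)
    return reranked + results[top_n:]
```

Tuning `top_n` and `weight` trades off between the original rankings and the interconnectivity signal.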

Here’s a variation of that method:

5. Rerank search results by sorting for country-specific results

A searcher may wish to see results biased towards sites coming from a specific country. Someone could explicitly choose a preference for a specific country, or the system may try to dynamically understand such a preference based upon IP address. The following patent application explores methods for reranking based upon country preferences.
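In its simplest form, such a preference might be applied as a score multiplier. A hedged sketch, assuming each result has already been labeled with a `country` (inferred from its ccTLD, server IP address, or page content) and the searcher’s country has been inferred as described above:

```python
def boost_country(results, searcher_country, boost=1.5):
    """Multiply the scores of results associated with the searcher's country."""
    reranked = [
        {**doc,
         "score": doc["score"] * (boost if doc.get("country") == searcher_country else 1.0)}
        for doc in results
    ]
    return sorted(reranked, key=lambda d: d["score"], reverse=True)
```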

6. Rerank search results by sorting for language-specific results

Preferences regarding language may be set by the user in a browser, or through the search engine, or may be identified by the search engine while looking at the search query, the user interface, and characteristics of the search results. Here’s one look at how results might be modified if a preference can be identified:

7. Rerank search results by looking at population or audience segmentation information

This method may look at things such as location, other individual demographic information, and information about groups that a searcher is associated with to help rank pages. Technically, this may not be considered a reranking since it doesn’t modify an original set of results, but there are baseline rankings that are altered by differences in populations.

8. Rerank search results based upon historical data

By looking at the age of documents, the age of links pointing to those documents, and other historical data, pages can be reranked based upon a large number of time-related factors. This patent application from Google contains a laundry list of those factors:
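The filing lists dozens of time-related signals; as one toy illustration of the flavor (not the patent’s actual formula), a baseline score might be decayed or boosted by document age, depending on whether the query appears to favor fresh results:

```python
import math

def age_adjusted_score(base_score, age_days, query_prefers_fresh, half_life=180.0):
    """Blend a baseline score with a simple exponential age factor.

    freshness is 1.0 for a brand-new page and decays toward 0.0 over time.
    """
    freshness = math.exp(-age_days / half_life)
    if query_prefers_fresh:  # e.g., news-like queries
        return base_score * (0.5 + 0.5 * freshness)
    return base_score * (0.75 + 0.25 * (1.0 - freshness))  # mild reward for maturity
```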

9. Rerank search results based upon topic familiarity

This method looks at pages for things like reading levels, use of stop words, and other textual features. A patent filing from Yahoo! that describes one way to do this allows searchers to use an interface to choose between introductory results, advanced ones, and a few degrees in between:
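One crude proxy for reading level is the share of common stop words on a page: introductory writing leans on common words, while advanced writing uses more specialized vocabulary. Here is a sketch of how a familiarity slider might rerank on such a signal; the stop word list and the slider mapping are assumptions for illustration, not Yahoo!’s method:

```python
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it",
              "that", "for", "on", "with", "as", "are", "was"}

def familiarity(text):
    """Fraction of stop words in the text; higher suggests more introductory prose."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in STOP_WORDS for w in words) / len(words)

def rerank_by_familiarity(results, slider):
    """slider runs from 0.0 (advanced) to 1.0 (introductory).

    Results whose stop-word ratio sits closest to the slider's target come first.
    """
    target = slider * 0.5  # stop-word ratios rarely exceed ~0.5 in practice
    return sorted(results, key=lambda doc: abs(familiarity(doc["text"]) - target))
```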

10. Rerank search results by changing orders based upon commercial intent

Similar to the slider-based method described above, Yahoo! Mindset (no longer available) let users determine the reordering of results based upon whether they wanted to see results that were more commercial or more informational in nature.

11. Reranking and removing results based upon mobile device friendliness

Microsoft provides a way to serve pages that display well on mobile devices, and to discard from search results pages that don’t:

12. Rerank search results based upon accessibility

Google recently came out with a specialized search that reorders pages based upon accessibility in their Accessible Web Search for the Visually Impaired.

13. Rerank search results based upon editorial content

A granted Google patent describes reranking of search results based upon whether or not certain pages have been determined to be favored or unfavored.

14. Reranking based upon additional terms (boosting) and comparing text similarity

This Google/Berkeley document describes reranking of results for a news search by considering and adding additional query terms, and by looking at document similarities.

15. Reordering based upon implicit feedback from user activities and click-throughs

There have been a lot of papers and patent filings that describe the reordering of search results by looking at user behavior and query selections. Here’s one that describes looking at different queries over user sessions:

  • Query Chains: Learning to Rank from Implicit Feedback (pdf)
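That paper learns a ranking function from preferences inferred across chains of related queries; a far simpler stand-in for the general idea is to boost results by their smoothed click-through rate for the current query. The click and impression logs below are hypothetical:

```python
def rerank_by_clicks(results, clicks, impressions, weight=1.0):
    """Boost results with high click-through rates for this query.

    clicks/impressions: dicts mapping a URL to counts from a query log.
    The +1/+2 smoothing keeps never-shown results from being zeroed out.
    """
    def ctr(url):
        return (clicks.get(url, 0) + 1) / (impressions.get(url, 0) + 2)

    reranked = [{**doc, "score": doc["score"] * (1 + weight * ctr(doc["url"]))}
                for doc in results]
    return sorted(reranked, key=lambda d: d["score"], reverse=True)
```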

16. Reranking based upon community endorsement

A number of documents describe collecting information from a large number of searchers or users of social networks. Here’s a small sampling:

17. Reranking based upon information redundancy

This method uses word probability distributions from a set number of results to identify the different topics that may be covered by a query, and it could be used to show a diverse set of results based upon those topics.

Utilizing information redundancy to improve text searches
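The paper works with word probability distributions to find topics; a simpler, related technique, greedy diversification in the spirit of maximal marginal relevance, conveys the flavor of the idea. This sketch (with assumed data shapes) repeatedly picks the result that best balances its own score against word overlap with the results already chosen:

```python
def diversify(results, top_k=10, trade_off=0.7):
    """Greedily pick results that balance their own score against redundancy.

    trade_off: 1.0 ranks purely by score; lower values favor novelty.
    """
    def overlap(a, b):
        wa, wb = set(a["text"].lower().split()), set(b["text"].lower().split())
        return len(wa & wb) / (len(wa | wb) or 1)

    remaining, chosen = list(results), []
    while remaining and len(chosen) < top_k:
        best = max(
            remaining,
            key=lambda d: trade_off * d["score"]
            - (1 - trade_off) * max((overlap(d, c) for c in chosen), default=0.0),
        )
        chosen.append(best)
        remaining.remove(best)
    return chosen + remaining
```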

18. Reranking based upon storylines

This document from IBM takes search results and reorganizes them into storylines, which it expands upon in some ways and filters in others, before presenting those storylines to a searcher.

19. Reranking by looking at blogs, news, and web pages as infectious disease

This IBM patent application draws an analogy to disease-propagation models to describe how segmenting the blogosphere and bulletin boards into topics, and paying attention to time-based changes and additions to those topics, might tell a search engine which topics and terms are popular, and where information about them might be located. While the process is described in the context of providing news-based alerts, the concept could be expanded to help with the reordering of search results based upon measures of popularity and burstiness (as in the next section).

20. Reranking based upon conceptually related information including time-based and use-based factors

In several ways, this next patent application describes a process similar to the last two methods listed. It involves grouping together concepts and looking at how those change over time and how different people participate in those changes. One of the co-inventors listed is Apostolos Gerasoulis, from Ask.

Conclusion

The rankings that you see for web pages in response to a query may not be the same rankings that other people see.

This isn’t a comprehensive listing of documents or processes that describe ways search engines may rerank pages, but it covers a lot of different possibilities. Some of these reranking processes are being used now, others may be put in place in the future, and a few may never be used at all. But chances are good that even more processes for changing the order of search results will come about in the future.

For each of these methods of reranking, there may be ways to make sure that the pages of a site will continue to rank well even if search results for different users are reordered. How would you go about addressing them?

105 thoughts on “20 Ways Search Engines May Rerank Search Results”

  1. Thanks, Cristian.

    I’ve been thinking about these for a while, and thought it might be time to put them into a blog post.

    Thanks for the Digg, and for linking to Digg in your comment.

    I think that if people start understanding some of the implications of these reranking processes, it might result in higher quality web sites – many of them focus upon rewarding well-rounded information based sites.

    For instance, addressing the one on topic familiarity in a site might mean including separate pages containing information both for people new to a topic, and for people with advanced knowledge of the same topic.

    Cheers.

  2. I’ve been thinking about these for a while, and thought it might be time to put them into a blog post.

    Bill, it just shows how much effort, time and research you have put into this post.

    For instance, addressing the one on topic familiarity in a site might mean including separate pages containing information both for people new to a topic, and for people with advanced knowledge of the same topic.

    Not just that, but most people (usually our clients) really need to understand that age is a pretty important aspect of web marketing and seo, and that no web agency in the world, no matter how professional or experienced the staff is, can overcome it.

  3. Good one. Age is a major issue these days. A client is planning on expanding his business activities next year, with a new site.

    Based upon that patent application, and even without it, following a strong marketing plan means it makes sense for him to register the domain for the site now, and put information on the pages of the site that people find helpful and valuable enough to link to.

  4. Exactly. Register now, start the business next year. But just put a splash screen. Or at least let the whois info run wild for 365 days.

    We don’t know for sure that Google and the friends check whois data and count it as a rank prerequisite, BUT, it’s always good to be safe.

    Lately, people are starting to register domains for longer than a year (4-5 years, etc.) because of the recent discussions about Google considering domains registered for only one year as not so trusted.

    Stories, I know, but it doesn’t hurt to know your industry, rumor or not.

  5. I have mixed thoughts about whether any of the search engines are looking at, or are considering the age of a site as shown in whois information.

    But having a business start conversations with potential clients, giving those visitors useful information, and building a presence on the web makes sense even in their prelaunch days.

  6. Very nice Bill!

    I have bookmarked it and will study it further and link to it later.

  7. This is a great list, Bill – it really conveys a grasp on the reasons that sites shift constantly in rankings.

    Some of these methods are fascinating – IBM’s “storyline” patent suggests a fantastic use within blog or news search: tracking the development and current state of a particular theme over time, etc.

    Thanks!

  10. Thank you, fantomaster.

    The process is becoming more complex, but one of the things I hope that people will focus upon here isn’t so much the complexity of the patent filings and papers that I pointed towards, but rather the concepts that search engineers and academics are writing about on how to come up with more relevant results for searchers.

    I’d urge anyone digging into any of the linked patent applications above not to take it as the absolute direction that a search engine may be traveling towards, but rather a possible goal that they may be attempting to reach, and that site builders can anticipate in the creation of their sites.

    There is value in making a site more accessible, in making it display well on mobile devices, in providing introductory and advanced information on topics the site is concerned about, in writing about topical and popular subjects that potential visitors write about and search for.

    These possible reranking methods aren’t hurdles to jump through, but rather potential opportunities – addressing them can make a site richer, and potentially more successful at meeting its goals.

    Search engines are changing and evolving in some interesting and exciting ways, and SEO probably has to do the same.

  11. Bill,

    I met you only once, but your posts are “required reading” in my online research. This one is outstanding. You are the “new Bill” in my tech world.

  12. Great resource, Bill, thanks for this.

    Besides pointing people to lots of different ranking approaches available to the search engines, it also illustrates that the overall process is turning ever more complex – something that’s often overlooked by amateurs (corporate and private alike) following all those simplistic strategies widely flaunted across the Web.

  13. Thanks, Liam.

    That particular issue has been one I’ve been pretty aware of lately, too. Some of the results can be drastically different. It does make life interesting.

  14. Excellent post Bill. Thank you. It really helps to keep all of these concepts in mind when planning for the future. Point 5 is of particular interest to me at the moment, as depending on what search you are doing, the top 10 results can vary quite a lot between google.com and other country specific google domains.

  16. Rock on Bill!

    This is insight into the AI concepts that the engines are using to provide positive experiences for users.

    Hey, what do you know, we actually have to provide a balanced approach and not a flash in the pan, here today, gone tomorrow (at least until they figure out how to filter me out of the SERPs). This is wonderful news, even if I haven’t looked at all of the changes that have been coming.

    Thank you, Bill, for your insight and ability to bring it all together. You are a gracious fortune teller with the insight, should we choose to use it!

  20. Dear Bill,
    I am teaching a class on advanced research techniques. This blog entry is so fascinating for anybody who considers why a search fails or succeeds on varying search engines. I see most of your commenters are search engine optimization specialists. This is a different group who will be excited to read your careful analysis! Bravo! and thank you!

  21. Thank you, Library Maven.

    My days as a law student helped inspire my passion for learning about information technology and search engines.

    Between shepardizing cases and statutes and spending far too much time in Westlaw and LexisNexis, I found that I not only enjoyed easy access to information, but also developed a desire to learn more about how those systems worked.

    It is amazing to see all of the experiments, and potential approaches that the search engines are attempting to use to bring meaningful information to people.

    If you come across anything that you have questions on, please feel free to ask.

  22. Thank you very much for this article! I think everyone in the “search business” is wondering what the next step will be. I think it will not be possible to make any big changes in the actual results because that would be a bad user experience. Maybe this will be an opportunity for new engines?

  23. I’m not sure, Chris.

    We see some of the search engines launching these types of changes independently of their main search, such as Yahoo’s Mindset and Google’s Accessible search. Mobile friendly searches are geared towards mobile users, and not desktop viewers.

    A number of papers and patents from Microsoft, and statements about Big Daddy from Google describe infrastructure changes to the search offerings from those companies.

    The Microsoft documents describe some aspects of their infrastructure changes as an attempt to make ranking processes modular, so that different ranking algorithms can be moved in and out quickly, or applied to different categories or sub-categories of certain types of searches.

    It’s possible that was one goal behind Big Daddy, too. Google has shown us that it can blend results from more than one type of search into search results pages as it provides query revision suggestions at the tops of search results, and in the middle of search results, and also provides information from Google local and other databases.

    Drastic changes in results may send off warning bells to users, yet a slow evolution and integration of other results may be more welcomed.

  24. On the other hand, if the end results provide a better user experience the changes might be less slow.

    I think you’re right, there. We’ve been hearing about Google quality assurance testers looking at pages in response to queries, and some Google folks have been participating in some of the big usability conferences. The idea of studying user queries and query sessions appears to be more common, too. I still think that movement will be cautious, but changes in rankings of pages in some segments might happen more quickly than in others.

    I’d be interested in seeing that study.

  25. Thank you for your reaction.
    I agree that it will be a slow evolution. On the other hand, if the end results provide a better user experience the changes might be less slow. I like the idea of reranking; it might be easier to get better results on very competitive keywords if you find a niche.

    I read the results of a study done in the UK that showed that search is costing more and more money. The time that is lost viewing irrelevant information has risen spectacularly in recent years. From that point of view, reranking can be a very useful help.

  26. I must admit that, as a search engine marketing company, we always seem to be looking for ways to get listed, move up, etc. We keep forgetting to consider some of the flip-side aspects.

    In particular, I think item 15 is often overlooked, but goes hand-in-hand with good content. It also supports the case for spending some money on promoting your site, as traffic does seem to help the rankings.

    Lastly, we do need to remember that not every person will see the same results for the same query (even at the exact same second).

  27. Excellent, well-written post, and it’s getting a lot of Diggs as well. Now what would be really interesting is to do some tests and try to figure out what types of queries trigger which types of reranking. I’m sure Google doesn’t rerank the results the same way for every search.

    For example, it would be great to figure out exactly (or as close as we can get) what types of searches (or maybe specific keywords) trigger a “commercial intent” rerank.

    P.S. What do I have to do to get included in that beautifully long blogroll of yours?

    (No answer required, just indirectly stating that I hope to gain some links from SEO experts like yourself as my online sphere of influence increases.)

    GREAT post!

  28. Hi Solomon,

    It’s not easy to tell what might trigger some of the reranking processes. There are probably more in place than the ones that we know about, and it may be possible for the search engines to change algorithms with every query, and use different reranking approaches in different classifications of searches.

    I have added your blog to my RSS reader, and every so often I look at the blogs that I’m reading and decide whether or not to add them to my blogroll. It is possibly getting to be too long as it is, though.

  29. Can’t believe I missed this post til now. It’s so good I couldn’t stop smiling as I read it. Thank you for sharing it.

    Eric

  30. Wow, learned a lot from this post…hm…have to print it out and study it more closely…

    Thanx anyway for posting.

  31. And this my good man is an example of why I was paranoid to see you walk through the door for my session on personalization in San Jose. 😉

    Awesome post Bill. Thanks, as always, for the continued insight.

  33. One year registered names not to be trusted? That’s got to be rumor. I think Google may be saying that, but I don’t think they will do anything about it directly. I think what they’re trying to do is change their algorithm to scan the content more like a human would (which is what they call Latent Semantic Indexing). I’ve seen some of it in action, and it seems to work very well. It scans the keyword search, and then produces results that not only match the keyword but that RELATE to the keyword as well. This means that you can start putting content on your site that doesn’t have to be ‘keyword rich’ but only ‘content rich.’ Good for us all, I think – even SEO experts.

  36. We have been getting a lot of questions from our clients regarding rankings of web pages. Your article has helped us understand how things work. You saved us.

  37. Bill

    One of the best posts of 2006.

    Thank you so much for sharing your wealth of knowledge with people like me, so we can quickly sort through all the noise and stand on the shoulders of giants 🙂 Knowledge sure is power.

    Please – don’t ever stop writing.

    You’re definitely one of the top 3 authorities in the SEO world.

    Kind Regards
    Aidan Rogers

  38. @ sharp aquos: The older your website is, the more TrustRank you will get. This is a good addition for your SERPs.

  39. If you did a revamp of your whole site in terms of looks and design, which factor does that fall under? How about if you have added more pages?

  40. Hi magicinmarketing,

    A redesign, the adding of new content, the changing of content on a site; the addition of new links, the removal of old links, moving to a new host; these are all things that might send signals to search engines that something has changed.

    Some of the things that I mention in the list above may be triggered by a redesign, for better or worse – like a site that is being redesigned for better accessibility might see some benefit in a search engine that shows a preference for accessible sites. A redesign that uses a CMS which might cause the introduction of duplicate content could have a negative impact.

  41. Very well explained article. But it is slightly confusing as to the number of aspects which are considered before a page flashes on our computer screens. The main aspect is how well your keyword is cached and the backlinks to it. I guess that brings you to the top 10 list.

  42. Hi Eva,

    One of the reasons that I wrote my two posts on how search engines may rerank search results was to get people to consider that there may be more going on behind the scenes in determining the order of search results than we may think about. I’m not sure that we can decide upon a “main” aspect any longer, if we ever could in the first place.

    Backlinks do hold some value in a link-based indexing system, using something like PageRank, but there have always been other considerations, such as the relevancy of the content of a page. We should expect more signals in the future rather than less. It might be time to do a third post on this topic soon.

  43. Hello Bill. Your article is very informative and useful. I didn’t know about search engine rankings until I read your blog. I check your site regularly but I seldom leave comments. But this time, I want to comment on this article because I enjoyed reading it. Thanks again, Bill. I’m looking forward to your next articles…keep ’em coming!

  45. Good article; a lot of us don’t think about how things are re-ranked, just how they’re ranking today. I’ll definitely file this away for reference in the future!

  46. Thank you, Joe.

    I think we’re seeing more and more re-ranking of search results, and will continue to do so. Google has been leaving a message at the top of many results that I see telling me that they have customized my search results based upon previous searches that I have performed, or based upon my location.

  47. Recently I have read a lot of talk about changes in Google SERPs. What is shown is that to achieve good rankings you must also have a variety of backlinks from a lot of different domains.

  48. Hi otimização de sites,

    Having links to your pages from a variety of different domains can be helpful if those links are of decent quality. High rankings can be achieved from a single link if it is of high enough quality, though acquiring more can help mitigate the risk that a search engine may miss that link, or that it might disappear overnight.

  49. Cool article, very nice, thanks. Nowadays we need to focus on the real deal when it comes to marketing, and well explained information is always welcome. Thanks again. Best regards.

  50. I don’t think this list is in order of importance, but if it was, I would rank duplicate content as no. 1. There are so many websites that use information that is already on the Internet. Unfortunately many new owners of websites do not know that this can be a problem when they are trying to rank in the search engines.

  51. Hi Atlanta,

    Thanks. It is a problem that many new site owners may not be aware of, and one that they definitely should be. I don’t think that I could even begin to list these possible methods that a search engine might rerank results in any particular order. I agree with you completely that the filtering of search results could have a substantial impact upon sites. I think that’s true of many of the others as well, though.

  52. Duplicate content is said to be a myth…but from first-hand knowledge I have seen sites apply duplicate content and within hours be moved down 2-3 pages, and sometimes completely disappear. Even when outsourcing content it is best to check via Copyscape or the like before running the risk of Google’s wrath.

  53. Hi David,

    Thanks. That’s a good point, worth considering. A recent video from Google called the idea of a duplicate content penalty a myth as well, but there are still issues around duplicate content that can hurt a site.

    1. Google will filter out pages from search results that are duplicated on the same site, or on different sites. There’s no guarantee as to which page, or which site will show up in search results. Sometimes the “original” won’t be the one that appears.

    2. A site that might be substantially similar to another one, in content and linking structure, might be considered to be a mirror site, and may not be crawled by search engines.

    3. The distribution of PageRank might not be set up in the best fashion for a site that may have more than one URL for the same pages, because of canonical URL problems. In some dynamic URL writing systems, it’s possible for a single page to show up under hundreds or even thousands of different URLs. The most I’ve seen for a single page was about 15,000 listings in Google’s index for the same page (under approximately 15,000 different URLs). More commonly, pages might be splitting PageRank between a version of the page with a “www” and without that “www.” That doesn’t help with rankings.

    There may not be a “penalty” for duplicate content, but there can be some serious repercussions. The Google video also distinguished between a penalty for duplicate content, and a penalty for “spam” where duplicate content was involved in the creation of web spam pages.

  54. People will notice this a lot lately with Google: when signed in, you get some more relevant results based upon your search history and your history of rating sites. You can now bump results up when signed in, leave comments on sites, and I am guessing they will eventually use this data to rerank results on an individual basis. I am looking for that in the next year especially, as live and personalized searching becomes more in demand.

  55. Hi Aaron,

    I’m not sure that the results are more “relevant” when you are logged in. They might be more in line with your past searches and browsing history, but basing future results on past actions can be a big mistake. To paraphrase Nassim Nicholas Taleb, the hand that fed the turkey the past 1,000 days may be the one that wrings its neck tomorrow.

  56. Hi Emin,

    Thanks. A patent filing may prevent other search engines from using a reranking method in the way that it is implemented and described within the patent, but I have seen many patent filings from different search engines that come up with what look like very similar results following processes that take different paths to get to those results. That includes many of the reranking methods that I mention above in my post.

  57. Wow, this article was written back in 2006 yet the content and advice is still relevant today. That is a rare find indeed. Well done, Bill.

    Reranking based on aging links seems to be happening to many of our clients. Something to consider…SEO is a marathon, not a sprint.

  58. Hi Shell,

    Thank you. I’ve been seeing these kinds of re-ranking features being referred to as “filters” more frequently since I posted this article back in 2006.

    Interestingly, the age of links pointing to pages may be a positive feature for some types of queries, and possibly a negative feature for others.

  59. Yeah, it’s pretty amazing that this was written back in 2006, as Shell said. It’s like you could see the future or something 🙂 Many of these have actually taken a foothold in today’s search world.

  60. How prophetic is this post! Especially the personal preferences section. The face of search has really changed the last four years.

  61. Thanks, Jason.

    Search has changed over the past four years, and it will probably continue to evolve. It definitely makes things interesting.

  62. I am seeing that age is a large part of the algo after having done some split tests with two domains that I own. While this is important, ultimately links and the relevance of their sources are what I use to gauge the effort involved.

    Interesting post Bill.

  63. Hi Geoey,

    Thanks for sharing your experiences with us.

    Your experiment sounds interesting. I agree that links, and how relevant the sources of those links are, are definitely among the areas to focus upon.

  64. Hi Bill, I’ve been working with an eCommerce company recently, and we’ve noticed something interesting occurring since around mid-November. For one major commercial search term related to the Christmas period, let’s say ‘Widgets Christmas’, the first 3 organic rankings have been rotating fairly evenly between their own site and the sites of two competitors.

    Every 2-3 days the order changes, so the 3rd placed site moves to 1st place, 1st to 2nd, and 2nd to 3rd. It is as though, for a major commercial term linked to the Christmas season, the top 3 places are being ‘shared out’ between the 3 websites. This could be (and probably is) pure coincidence, based upon new links secured by 3 competitive websites being found at random intervals, but it does seem to be almost predictable.

    Are you aware of any patents that might imply tests are taking place? As I say, I’m sure it is simply down to intensive, competitive link building activity taking place, but I just thought I’d flag it up in case there’s something more behind it.

  65. Hi David,

    There are a number of reasons why rankings might change around in the manner that you describe, from changes that you make to your website, to changes that your competitors make on theirs, as well as changes in the search algorithms themselves. A reranking approach that looks at things like how many of the top (10, 100, or 1,000) sites link to each other, and boosts some sites based upon those links, could provide different results every time you search if there are new sites in the top 10, 100, or 1,000.

    I have written about one approach from Google that I’ve seen echoed in a few different patents from them that relies on collecting a great amount of information about how people interact with search results and pages, and can in effect look like the search engine is doing a great amount of split testing. See: Improved Web Page Classification from Google for Rankings and Personalized Search

    One basic premise behind that approach is that the search engine collects information about users (and groups of users, organized by common interests, geography, and possibly other factors, so that a user may belong to multiple groups), queries, and web pages. That process develops profiles for those users, queries, and web pages, based in part upon information collected about how a user (u) acts when using a certain query term (q) and interacts with a specific page (p). Information from those interactions, or instances based upon the (u,q,p) triples, may be used to influence the search results of others.

    There are lots of combinations of instances that can be seen when looking at searching behavior in this manner. Some examples:

    Does a certain searcher pick a certain page after searching for a specific query?
    If he or she does, how long do they dwell on that page?
    If that user chooses that page for that query, do they then go on to refine their query?

    This can end up being a little like the Amazon approach to recommending books or other products: People like you who viewed this book also viewed these other books.

  66. Hi Bill, many thanks for the detailed response. I’m off to make a cup of tea before settling down to read your other recommended post. Best regards, Dave

  67. Hi Wolter,

    We’ve definitely been moving more towards the search engines looking more at social activity and personalization, and probably will continue to do so. There are so many potential reranking (or filtering) approaches that the search engines can potentially use to boost or lower the rankings of pages that it can be difficult to understand in many cases why some pages rank where they do. Guess that makes things interesting.

  68. This is great information! Is most of it still relevant today, in 2011? As far as geographical results coming through nowadays, you need to be in the top 3 or you could be pushed back to page two, which is scary for some website owners.

  69. Hi Brett,

    I think there’s a good chance that many of these reranking/filtering approaches are still relevant today, though the important point of this post isn’t so much about the individual methods themselves, but rather that the search engines have been developing a number of ways that sort and rank results, and that many people performing the same searches can see very different results.

  71. Hi Bill,

    What’s your view on (and how do these compare to) the SEOmoz ranking factor list? Any factors you feel are missing or understated?
    All the best, Harry

  72. Hi Harry,

    The SEOmoz ranking factor list is a different type of beast completely. In the SEOmoz ranking lists, Rand came up with a number of possible signals that might influence search rankings and asked a number of SEOs to rate those, and provide opinions on them (I’ve done that on a couple of those). It’s more an opinion survey than anything else. I actually would have liked it better if Rand asked us to come up with our own lists of signals, and our opinions about those – it possibly would have been a much stronger set of lists.

    What I wrote about in this post involves special filters or reranking methods that search engines might use after they’ve come up with rankings for pages based upon the kinds of signals that the SEOmoz surveys discussed.

    For instance, Google might rank pages for different queries based upon the kinds of signals that the SEOmoz survey covers, and then it might change the orders of pages in those rankings based upon the country where someone is located, whether or not they had personalization turned on, or for a number of other reasons.

    I’ve written about a lot of reranking approaches on this site, and wrote two other posts that compiled some of these reranking methods at:

    20 More Ways that Search Engines May Rerank Search Results

    Another 10 Ways Search Engines May Rerank Search Results

  73. Hey Bill,

    First let me say that I have enjoyed reading your blog. I’ll tell you what: it is just amazing how rapidly the SEO landscape is changing. The interesting thing about your topic here is that Google seems to be blatantly leading the way with making frequent changes. The funny part about all of the changes is that they seem to have penalized as many “good” sites as they did “bad” sites. In many cases it would even seem as though the search engines are now favoring poor content. It also seems like there is always some “expert” building a strategy to prove the basics wrong, and somehow they are slipping through the cracks.

    Chris

  74. Hi Chris,

    Thank you. SEO is a constantly changing endeavor. According to a number of different Google spokespeople (the latest being Google research head Peter Norvig), Google averages a couple of changes to its core search algorithm a day.

    I think those changes often tend to be small ones that don’t impact a lot of sites, but sometimes they do have a major impact.

    I don’t know that those changes harm as many good sites as they do bad ones. Matt Cutts recently pointed to spamclock.com, which tells us (today) that “Every hour 1 million new spam pages are created.”

  75. Even though this is an article from 2006, it’s very interesting to read that it’s still very relevant in most ways, especially the personalization of web search. Now, with Google+ being integrated into the “Google experience”, we might one day reach a point where the SERPs have become entirely personalized. That would be a major hassle for SEO specialists. Ten years ago, search engine results were rock solid. If you were ranking in first or second place, you were sure to stay in that position for a long time. Nowadays you see daily shifts, and with the influence of personalized results, I cannot imagine that search results will stop being on the move ever again.
