Evaluating the Relevancy of Search Results Based upon Position

The purpose behind SEO isn’t to outrank every other site on the Web for certain queries. The purpose behind SEO isn’t to draw large amounts of traffic to a web site.

Rather, the purpose behind SEO is to make it easier for people to find a site that they are interested in, that offers what they are looking for, and that meets some informational or transactional need that they might have.

Ranking number one in search results isn’t always the best place to be. Sometimes it’s better to rank number two, or even a little lower, especially if someone visits one or more of the sites above yours, and sees that those sites don’t deliver what you offer.

Case in point: a site that I’d been working on for years had been trading places with another site between the number one and number two positions in Google’s results for a very relevant query term. When the site was at number two, it tended to get many more conversions from visitors than when it was at number one.

Both sites were very relevant for that specific query term. Both fulfilled visitors’ informational needs. But the other site didn’t actually provide services based upon that information, while the site I was working with did. Being number two seemed like a good place to be.

How do search engineers feel about relevancy and ranking in search results?

It’s possible that some may be re-evaluating some of their assumptions on the topic.

The position of search results is something that researchers evaluating the quality of those results spend a fair amount of time upon. One model that weighs the relevance of search results against their ordering is known as discounted cumulative gain (DCG). Under DCG, a set of search results is considered good if the most relevant results appear highest in the list.
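As a rough sketch of the idea (the exact gain and discount functions vary between formulations), DCG credits each result with a gain based on its graded relevance, discounted by the logarithm of its position, so relevant documents pushed lower in the list contribute less. The relevance grades in this Python example are invented purely for illustration:

    from math import log2

    def dcg(grades):
        # Discounted cumulative gain for a ranked list of relevance grades
        # (e.g. 0 = bad ... 4 = perfect). One common formulation uses
        # (2^grade - 1) as the gain and log2(rank + 1) as the discount.
        return sum((2 ** g - 1) / log2(rank + 1)
                   for rank, g in enumerate(grades, start=1))

    print(dcg([4, 3, 2, 1, 0]))  # most relevant results first: higher DCG
    print(dcg([0, 1, 2, 3, 4]))  # same results reversed: lower DCG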

A whitepaper published by researchers from Yahoo! and Google, Expected Reciprocal Rank for Graded Relevance (pdf), discusses how to evaluate the results that a search engine delivers to searchers in response to queries, and proposes an alternative metric for evaluating search results, intended to replace discounted cumulative gain.

We’re told early in the paper that:

One serious issue with DCG is the assumption that the usefulness of a document at rank i is independent of the usefulness of the documents at rank less than i.

Recent research on modeling user click behavior has demonstrated that the position-based browsing assumption that underlies DCG is invalid.

Instead, these studies have shown that the likelihood a user examines the document at rank i is dependent on how satisfied the user was with previously observed documents in the ranked list.

Very interesting paper, worth spending time with if you’re concerned about how search engines determine the quality of their search results (and/or about how well your site might rank in those results for specific queries). There’s some math to struggle through (or skip), but it’s worth exploring the ideas and tidbits of information in the paper regardless of your mathematical literacy.

For example, the paper asks readers to decide which of two sets of search results is better:

  • a first list with a “perfect” result as the first page listed, followed by 19 bad results, or
  • a second list where all of the results are very good.

As an aside, the researchers tell us that the only “perfect” results are pages that are the “destination page of a navigational query.” See Redefining Navigational Queries to Find Perfect Sites for a discussion of how search engines might attempt to find perfect results for navigational queries.

The answer, according to the researchers?

The list of results with the perfect result at the top is probably better, because it’s likely that no one looking at those results will go past the first result.

With the second set of results, where all of the pages are very good (or relevant), a searcher would have to spend much more time looking through most or all of the listings to get the same amount of information.
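To make that example concrete, here is a minimal sketch of the expected reciprocal rank idea the paper proposes. The grade-to-probability mapping follows the form used in the paper, (2^grade - 1) / 2^(maximum grade), but the two lists of relevance grades below are invented for illustration. The searcher is modeled as scanning down the list and stopping once a result satisfies them, so a later result only earns credit when everything above it failed to satisfy:

    def err(grades, g_max=4):
        # Expected reciprocal rank under a cascade model: each grade maps to a
        # probability that the result satisfies the searcher, and ERR is the
        # expected value of 1 / (rank at which the searcher stops).
        score, not_yet_satisfied = 0.0, 1.0
        for rank, g in enumerate(grades, start=1):
            r = (2 ** g - 1) / 2 ** g_max   # probability this result satisfies
            score += not_yet_satisfied * r / rank
            not_yet_satisfied *= 1 - r
        return score

    perfect_then_bad = [4] + [0] * 19   # one "perfect" result, then 19 bad ones
    all_very_good = [3] * 20            # twenty very good, but not perfect, results

    print(err(perfect_then_bad))  # about 0.94: searchers almost always stop at result one
    print(err(all_very_good))     # lower: credit for later results shrinks quickly with rank

Under a purely position-based measure like DCG, the second list would come out well ahead; under this cascade-style measure, the list with the perfect result on top wins, which is the point the researchers are making.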

Conclusion

In my initial example above, about the two sites that often traded places for the first and second listings in search results for a specific query, both were very relevant for the query term they ranked well for, but one was a government site that described but didn’t offer services associated with the query. Many searchers were interested in the information that both sites provided, but also wanted to pursue those services.

When the government site was listed first, and likely visited by searchers, the information it provided made it more likely that people would visit the other page next in the search results and act upon that information. When the non-government site was listed first, it was more likely that people visiting it were still in an information gathering mode than a transactional one, and were likely to visit the government site to learn more before engaging the services of a commercial site.

It’s interesting that the researchers involved in this whitepaper tell us that some pages are “perfect” pages because they are the ideal destination for navigational queries, but then ignore that the intent behind a search might also be informational or transactional in nature (see: A taxonomy of web search for a discussion of the differences between navigational, transactional, and informational queries).

Searchers may have different intentions when they perform searches, including learning more about a topic, comparing what different sites have to offer, and performing a task whether it’s downloading something or buying something. Those intentions may change, for instance from information gathering to performing a transaction, even within the same set of search results resulting from a single search.

A searcher may travel further down a list of search results when such a change takes place. When they do, it isn’t necessarily bad to be number two, or even further down in the results. Sometimes ranking number one for a query just isn’t the place you want to be.


38 thoughts on “Evaluating the Relevancy of Search Results Based upon Position”

  1. Hi alekssad,

    Unfortunately, that’s one of the biggest myths of SEO. Meaningless traffic, regardless of the volume, is still meaningless. An SEO should make a positive difference in fulfilling the objectives of the site they are working upon, and if all they do is drive lots of traffic, without meeting the actual objectives of a site, then they’ve failed.

  2. I’ve always said that ranking #1, while definitely good, is not always possible, and it’s easier to make your site convert better than to do all the work to get to #1. I notice that when I search myself, I usually look at the full top 5 before picking one to click on. Good post Bill!

  3. Hi Aaron,

    Thanks, that’s a good point. Increasing conversions, whether it’s making more sales of goods or services, having visitors make more donations or sign up to volunteer, upping the subscription numbers for a newsletter, having more downloads for some software, or in other ways, is always an effort worth making.

    Ranking head-to-head for very competitive query terms against competitors who are very well established can be risky as well. If Ben & Jerry’s had started with and stuck to Strawberry, Chocolate, and Vanilla as their main flavors, they probably wouldn’t have gotten very far.

    Ranking number one can be good, but there are times when being lower in the rankings can be very good as well.

  4. Wearing my marketing hat, I have to worry about: 1. New User Acquisition, 2. Conversion of the visitor from landing pages, 3. Retention of users, 4. Referral from Users.

    But as far as output from my SEO campaign goes, all I care about is new user acquisition. So drawing large amounts of traffic is indeed a key component. Being #1 or #2 in search results doesn’t really matter. I would much rather focus on other components of the marketing plan.

  5. So the take-home? Create that transactional site in a way that provides the highly relevant information searchers require before they go into “transact-mode”?

    I think there are so many dependent variables in here that it’s impossible to determine an optimal course of action, and of course slightly different searcher intent will result in very different outcomes. The nature of the product also comes into play… I could go on.

    The only real take-home is that online interactions of all descriptions are so incredibly complex when you start digging :)
    The fun never really ends does it Bill?

    Rgds from a very rainy Phuket
    Richard

  6. Hi Sumit,

    I like your approach. Focusing upon traffic is a much better obsession than focusing upon rankings. Of course you want traffic to your site, but you want the kind of traffic that is more likely to bring people who are interested in becoming users of your site. That qualified traffic means it’s more likely that you will have higher conversion rates, retain more users, and have them refer others.

  7. Hi Richard,

    Providing information that can help visitors make an informed decision is an approach that can work well. :)

    Online interactions can be incredibly complex, as you note, with the same searcher possibly being intent on information foraging, or comparison shopping, or in a transactional mode, or even returning as a customer. In each of these states, a searcher could view the same set of search results from a completely different perspective, with some results being more attractive (and clickable) than others. I like the point that the paper makes, that predicting how likely someone is to click on a search result solely from the position of that result is too simple an approach, and that a result shouldn’t be considered independently of the quality and content of the results that might appear above it. Factoring in the different possible intentions of searchers regarding those results makes things even more complex.

    It does make things interesting.

    The forecast for Phuket doesn’t look good over the next few days.

    Thanks.

  8. You need to do your research into your competition thoroughly then. It must be quite difficult to rank for a certain spot, though. At least with #1 you can’t get any higher.

    Really is about converting your traffic. Quality not quantity!

  9. Hi Bill,

    If only some would stop chasing ranking and traffic metrics obsessively. There’s no point in getting a huge amount of traffic with a high bounce rate that, most importantly, doesn’t resolve what the visitors are actually looking for.

    And showing meaningless beautified traffic data charts without having a clear understanding of what the business and visitor goals are in relation to revenue and cost.

  10. Hi Dan,

    Good points. Researching competition is always a good step to take for any marketing plan, regardless of whether or not it includes SEO. If it includes SEO, sometimes the sites and organizations that you might think are your competition may not actually be.

    I understand what you are saying about ranking for a specific spot, but it can be easier to rank number two than number one. :)

    Agree completely on quality of traffic rather than quantity.

  11. Hi Deric,

    Yes, it can be easy to get caught up and obsessed with traffic and rankings. Every campaign should ideally focus upon understanding the goals of a site and its visitors, and work from there to meet those goals. It’s really nice to see traffic rise and bounce rates fall while conversions increase.

  12. Wait a minute; should we not encourage math/science education? They should read through the math parts in the paper ;)
    Sometimes I wish that certain posts would not rank well in the SERPs. When I noticed that the natural gas provider here was offering financial assistance to buy back-up generators, I wrote a post filling in information that they did not provide, but that someone in my profession was well placed to explain. This caused many people to call me thinking that I was responsible for the program, even though my post explained that I was not involved. Some bad PR developed, and I was only trying to be helpful. Lesson learned.

  13. There I was, hoping to rank a commercial-intent page this past week, and no matter what I tried with internal linking, in-depth content, etc., I couldn’t get the pages I wanted to rank instead.
    I could have influenced things with some linkbuilding, but haven’t yet decided which LP I want to build.
    One additional factor was external links on the page without commercial intent, which might just tip the balance the wrong way, but I can’t really remove them.

  14. I would be surprised if more conversions were achieved at position 2. A higher percentage, certainly, but more in total doesn’t sound quite right to me…

  15. Hi Frank,

    OK, they should read the math parts, too. :)

    I do think it can be helpful for people who might not know or understand the math to read the paper anyway.

    I’ve had a couple of people call me asking me for help with their Yahoo paid search campaigns as if I were Yahoo customer support, after they found my site through a search. Fortunately, I was able to point them in the right direction without any controversy or ill feelings. I also rank in the top ten in queries for “bookshelf plans” after a post on “virtual bookshelf plans,” and I wonder how many people visit and are disappointed that I don’t have schematics and tips for building bookshelves on my site. Hopefully some webmasters from hardware stores, woodworking sites, and other relevant businesses will read this comment and decide to build some pages about bookshelf plans and help me out of that predicament.

  16. Hi Paula,

    Achieving conversions in what you offer online is a matter of providing the right mix of information and opportunity to people who are or become interested in what you have to offer on your site. It’s possible that those people may visit your site as the first page in a set of search results, or as a lower result.

    Many people try to do some research before they make a purchase, and many look at more than one site when they are shopping for something, and compare what they find.

    The paper I wrote about described how visits to sites weren’t predictable solely on the basis of which site showed up in which position, independently of each other, but rather could be influenced by how relevant those sites above might be. My example is of a site that tended to do better with conversions in the second position rather than the first because, even though the site that would sometimes rank above it was relevant and informative, it didn’t actually offer a commercial service associated with that query.

    That situation isn’t going to be the same for everyone, but I think a compelling title and snippet showing for a page in search results might help capture visitors who are in an information browsing or comparison mode, even if a site isn’t the highest position for a query. And, if what is offered on the page is presented in a way that might make a conversion more likely, then ranking lower after a potential buyer has had a chance to make a comparison with another offering from someone else may not be a bad thing.

  17. Hi Andy,

    The commercial intent aspect of queries and ranking is one that isn’t well defined or explored in this paper, or in many of the papers that I’ve read about rankings and evaluation of search results, but it is something that I’ve seen in actual practice.

    I have seen other papers that discuss the classification of pages based upon factors such as the anchor text pointing towards those pages, and that kind of commercial vs. informational intent plays a role in some of those. It’s an area that would be an interesting followup to this paper, especially since the authors of the paper inserted the issue of navigational queries and how a “perfect” page at the top of results can impact other pages that follow in the results.

    I have seen result sets that are heavily skewed towards providing results that evidence a commercial intent, and it can be difficult to rank well with a page that is more informational in nature. Creating pages on the same site that are more commercial in nature and others that are more informational in nature is one way to try to overcome that issue, but that can be challenging as well.

    I wrote about a Yahoo patent filing a few years back that described a process to classify and distinguish informational from commercial pages, Sorting Commercial Pages from Informational Pages, and Microsoft’s paper on the subject, Detecting Online Commercial Intention (OCI) gave us some ideas of what they were looking for when labeling pages as commercial.

    For queries that are somewhat ambiguous about whether they are commercial or navigational or informational in nature, I think the best approach for the search engines is to provide a diverse mix of sites. A search for something like “purchase dress shoes” should probably show more commercial results than one for “How to build a home computer network.”

    It’s interesting to see how intent is becoming more and more a part of how “relevant” a result might be for a search.

  18. I usually check out the first page worth of hits before I decide to click on one, but the reality of most Internet searchers is that they likely just click on the “top few” and that’s it. I think striving to be #1 and “settling” for being #2 or #3 is a good goal to set.

  19. I agree, sometimes getting to the top of a ranking does not result in a conversion. You have to have an easily converting site that is profitable.

  20. Wow it’s a fantastic post Bill!
    I think that the most important thing is bringing targeted traffic to the website, not gaining the best position that you can.
    Companies can invest in the Internet only if they have conversions, not just visits!
    I think it’s very important to use snippets with marketing aims, to increase targeted visits and to exclude the visitors that you don’t want.
    The problem here in Italy is that customers don’t understand this: they want the “golden position” because in most cases it’s the only metric they are able to understand!
    Greetings from Poor SEO Italy ;-)

  21. I think as long as you are on the first page, it doesn’t matter whether you are 1st or 2nd. Your website is going to get good exposure.

    And I also agree with what Bill said, that the purpose behind SEO is to make it easier for people to find a site that they are interested in.

  22. Nice article. I think there are blind spots on any web page, and that includes search engines. One thing I find interesting is to see a heat map for a search engine, e.g. Google.

  23. Hi Joe,

    I’ll often scroll through a set of search results, and open the ones with titles and snippets that look interesting in a new tab. I like to set my preference for results to the maximum number allowed, whether 100 or 50 depending upon the search engine, and I will often go to a second page of results. I expect that many people don’t look as deeply at search results as I do, but I hate to generalize about how others might search, or to assume that they limit themselves to the first few results.

    However, I don’t think it hurts to rank amongst the first few results. :)

  24. Hi Anthony,

    Yes – ranking and conversion are two different but related things – getting people to your web site, and getting people to do something once they arrive on your site. Both are important, and neither should be ignored.

  25. Hi Andrea,

    Thank you. I agree completely. I don’t think it’s very wise to start doing SEO until you have a good understanding of the objectives of a site’s owners, and who their audience might be. The objective of a site is never to just be seen by as many eyeballs as possible, and the audience is never “everyone.”

    It is easy to understand being number 1 as a metric, and that is a common mistake even outside of Italy :) But we both agree that if the term you’re ranked number one for isn’t very relevant for what your site offers, or very appropriate for how you present what is on your site, or isn’t a term that your audience will likely use to find your site, then it’s a pretty shallow metric.

    Build a site for people who might be interested in what it offers, make it easy for them to understand that it is for them, and make it easy for them to find it. A title and snippet should give potential visitors some confidence that the page that exists beyond the search result offers what they want to see. Regardless of whether that result appears at the top of the search results, or second, or even somewhat lower.

  26. Hi Cam,

    Funny – I was thinking about heat maps when reading this paper, too. I like the idea of heat maps, and mouse click trails, and click overlays, and other ways of recording how people view a web page, but it’s always nice to have more information when digging down into how a person is viewing a search result.

    My first impression upon seeing the well-known “golden triangle” of search – a heat map pattern that showed people viewing search results along a line from left to right, and then at a diagonal down and to the left – was that it was a viewing pattern that many viewers would use with any written document (except in cultures where people read from right to left, or from the bottom to the top of pages).

    But the heat map research that uncovered that golden triangle also included much more complicated patterns that weren’t discussed as much. Regardless of that, I like the idea that heat map information be discussed and used in conjunction with other information, such as a discussion of the tasks involved and other user data collection methods.

    Patterns do happen, but exploring patterns without looking at the reasons for those patterns, or for additional corroborating evidence, is risky, and prone to potential false negatives and/or false positives.

  27. Hi Cam,

    I’ve seen a couple of those, and I agree with you that they can be useful, and worth spending some time investigating. The paper I wrote about discusses how search engines can evaluate how well their sites work and how much value they can provide to visitors, and it provides ideas on how they can make changes to what they offer to help better meet their objectives. That’s something that site owners, regardless of whether they are search engines or not, should think about as well. Heat maps can be informative tools to help learn more about how people use your web site.

  28. Interesting points about not always seeing the best conversions in position one. However, if position one on average gets around 40% of the total clickthroughs, then surely that’s the place to be?

  29. Hi Jason,

    It’s rarely a bad thing to be listed at the top of search results, regardless of what the click through rate might be for that position.

    I would imagine that the clickthrough rate for the first result for some queries is much higher than 40% as well, and possibly much lower for others. For a navigational query, such as “ESPN,” chances are pretty good that most people will either click through to the website for ESPN, or one of the quicklinks that appears included with that result. Chances are pretty good that the ESPN website is at the top of the results for that query as well.

    There’s also a strong bias on the part of searchers to believe that the top result is often the best result for a search. The discounted cumulative gain approach that search engines often use to evaluate the relevance of their own results predicts how many times people will click on a result based upon its position in the rankings, independently of other factors, such as how relevant the result in the position above it might be. That’s what the paper I’ve written about in this post disagrees with.

    I used an example of a search result that tended to have a higher conversion rate when it appeared second in the rankings. Often when people search, they aren’t just looking for the most “relevant” result; sometimes they are also comparing what they find at more than one result. The difference in those results was that one offered information only, and the other offered information plus the chance to make a transaction that wasn’t available with the information-only result. In that instance, being second in the search results wasn’t a bad place to be.

  30. Hi Bill

    It really depends on what you want to do with your website/blog, etc., and what information you are providing. If you are there to make money in a home-based business, or any business for that matter, then being in the top 3 spots is better than being in the bottom part of the page. I like your example of how, when you were in the second spot, your conversions were better. As time goes on, I suspect most people will look at more than the top 3 results before moving on.

    The best approach is to write real and meaningful content that will assist your readers and offer a solution to a problem they have. By doing so, you build credibility, get natural links back to your site, and, depending on what you are offering in terms of a service or product, you can build an income. Of course you have to go out and build backlinks to your website as well.

    Part of the problem that we all have is really limited information: the actual elements or factors in search engine algorithms are a trade secret, so nobody outside of the company really knows what all the factors are for relevancy.

    Bill, do you have a breakdown of the percentages for the top 3 spots versus the rest of page 1, for how many people click on each result?

    Cheers

  31. Hi Darrell,

    Just quickly, before responding directly to your comment, I want to say something that probably should have been said in the body of my post. The purpose behind SEO is to increase the amount of traffic to a site from people who may be interested in what that site has to offer. That may mean that the pages of a site might be optimized for specific terms that are a mixture of competitive, relevant, popular and appropriate for the pages that they appear upon. It also may mean that the pages of the site also show up well in search results for a large mix of long tail terms, for words that appear upon the pages of the site, but which might not be the primary keywords that were focused upon in optimizing those pages. In the end, when considering a return on investment for search engine optimization, how well ranked pages might be for a site are much less important than whether or not the objectives of the site are met.

    Helpful, useful, meaningful, engaging content can make a difference, and definitely some pages of a site should focus upon helping consumers make informed decisions when on your pages. On other pages, less text and content may be better, especially if that content keeps someone from actually purchasing, or downloading, or playing a game, or doing whatever it is that they came to your site to do.

    I don’t have a breakdown on the percentages of the top three spots versus the rest of the page, and actually, that’s one of the things that the authors of the paper are railing against. Trying to predict how many people will click on a certain result based upon the position of that result is a flawed way of thinking about how people interact with search results.

    If the first search result is a perfect result (a page that meets the needs of a navigational query very well, like the homepage for ESPN when someone types “ESPN” in their search box), the probabilities that someone will click on the other results goes down tremendously. Or if the second result meets the informational or transactional needs of a person visiting those results, it’s possible that the percentage of clicks on lower results will not fit some prediction of click throughs based solely upon where they are located in those results.

    That approach of predicting clicks based upon location, without considering other factors, is one that many search engineers have used in the past to evaluate how well a search engine is doing in providing relevant search results. It’s known as “discounted cumulative gain” and the authors of the paper point out that you can’t just look at position alone.

    There are other problems with looking at something like an “average” percentage of clicks for the top three spots versus clicks for results on the rest of a page to try to predict how much qualified traffic a listing in search results might send based upon locations.

    One is that some words have more than one meaning, and a search engine might try to show a diverse set of search results. For example, someone searches for “Java,” looking for the island of Java, and the first five results in Google are for the programming language. The sixth result is the Wikipedia entry for the island. If you wanted information about the coffee known as Java, you might not see a result until you get to a tutorial on brewing java, at number 27 in the search results.

    Another, as you mentioned, is that if you are looking to buy something based upon your query terms, and the titles and snippets from the first 7 or 8 results pretty much look like they are completely informational results, and wouldn’t give you a chance to make that purchase, but the 9th result does, then the intent behind that search (and a well written title and snippet) increases the odds that someone looking to make a transaction will click on that result, even though it’s near or at the bottom of a page.

    Many of the algorithms behind search engines are trade secrets, but we sometimes get some glimpses at methods and assumptions behind those algorithms while looking at patents and white papers. I do a lot of that here.
