We’re All Google’s Lab Rats


A recent comment here noted that the core algorithm behind how Google works hasn’t changed very much since its earliest days. I’m not sure that I agree. Many of the posts I’ve made over the past five years that involve Google patents and whitepapers describe ways that Google may be changing how it determines which results to show searchers.

Many of the changes Google makes to its algorithms aren’t always visible to its users, while others that change the interface we see when we search tend to stand out more. Interestingly, many of the changes that Google makes are based upon live tests that we may catch a glimpse of if we are lucky and pay attention.

Google’s Testing Infrastructure

At Google, experimentation is practically a mantra; we evaluate almost every change that potentially affects what our users experience. Such changes include not only obvious user-visible changes such as modifications to a user interface, but also more subtle changes such as different machine learning algorithms that might affect ranking or content selection…

From:

Overlapping Experiment Infrastructure: More, Better, Faster Experimentation (pdf)

At the KDD-2010 conference (the 16th Conference on Knowledge Discovery and Data Mining), being held in Washington, DC this week, Google is presenting on the infrastructure behind how they test changes to their interface, as well as changes to the ways that they may rank pages or choose content to present to viewers.

As they tell us in the introduction to the paper:

Google is a data-driven company, which means that decisionmakers in the company want empirical data to drive decisions about whether a change should be launched to users. This data is most commonly gathered by running live traffic experiments.

These kinds of experiments can include user-visible changes, such as changing the background colors of ads, and non-visible changes, such as one that might predict the clickthrough rates of those ads.

We’re told that the aim behind Google’s experiment infrastructure can be described with the words “More, Better, Faster.”

More, as in the ability to run more experiments simultaneously, while being flexible enough to use different configurations as needed.

Better, as in the ability to quickly catch and disable experiments that may be worth testing but which might be buggy and have unintended bad results.

Faster, as in the ability to set up experiments easily, so that even non-engineers can run an experiment without writing code.
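To make the idea of running many experiments at once a little more concrete, here is a minimal sketch of the kind of layered assignment the paper describes. Everything in it is an illustrative assumption on my part (the layer names, experiment names, and bucket counts are made up), not Google’s actual code: traffic is divided into layers, each layer hashes a request into a bucket, and within a layer at most one experiment claims that bucket, so a single query can participate in several experiments simultaneously.

    # A minimal sketch (illustrative assumptions, not Google's code) of layered
    # experiment assignment: each "layer" independently hashes a request into a
    # bucket, and within a layer at most one experiment owns that bucket, so a
    # single query can be in several experiments at once, one per layer.
    import hashlib

    NUM_BUCKETS = 1000  # assumed bucket count

    # Hypothetical layers, each mapping experiment names to the buckets they own.
    LAYERS = {
        "ui_layer": {
            "yellow_ad_background": range(0, 50),
            "larger_snippet_font": range(50, 100),
        },
        "ranking_layer": {
            "freshness_boost": range(0, 100),
        },
    }

    def bucket(cookie_id: str, layer_name: str) -> int:
        """Hash the cookie together with the layer name so that assignments
        in different layers are independent of one another."""
        digest = hashlib.sha256(f"{layer_name}:{cookie_id}".encode()).hexdigest()
        return int(digest, 16) % NUM_BUCKETS

    def experiments_for(cookie_id: str) -> list[str]:
        """Return every experiment this request falls into (at most one per layer)."""
        active = []
        for layer_name, experiments in LAYERS.items():
            b = bucket(cookie_id, layer_name)
            for exp_name, owned_buckets in experiments.items():
                if b in owned_buckets:
                    active.append(exp_name)
                    break  # only one experiment per layer
        return active

    print(experiments_for("example-cookie-123"))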

Google Algorithm Changes

We’ve been told by representatives from Google that they are constantly updating and changing the algorithms behind their web search.

In the New York Times article, Google Keeps Tweaking Its Search Engine, Google Fellow Amit Singhal describes how Google tested a new algorithm to make certain searches contain fresher and more timely results.

A Wired story, Exclusive: How Google’s Algorithm Rules the Web, notes that Google will likely introduce “550 or so improvements to its fabled algorithm” over the next year (2010). We’re also told that:

“On most Google queries, you’re actually in multiple control or experimental groups simultaneously,” says search quality engineer Patrick Riley. Then he corrects himself. “Essentially,” he says, “all the queries are involved in some test.” In other words, just about every time you search on Google, you’re a lab rat.

In a video from April 2010, Google’s head of Web Spam, Matt Cutts, mentions that Google tends to make at least one change to core web search algorithms at least once a day:

Testing Tools and Education

The Overlapping Experiment Infrastructure paper tells us that a testing infrastructure by itself isn’t enough – that Google has also developed a suite of tools to help them measure the impacts of tests, which include real-time monitoring of basic metrics to determine when something unexpected might happen during a test.
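As a rough illustration of what such real-time monitoring might do (the metric, threshold, and numbers below are my own assumptions, not anything from the paper), a simple guardrail check could compare a basic metric such as click-through rate between an experiment arm and its control, and flag the experiment so that it can be disabled quickly:

    # A minimal sketch of a guardrail check on a basic metric (CTR here).
    # The 5% threshold and the sample numbers are assumed for illustration only.
    def ctr(clicks: int, impressions: int) -> float:
        return clicks / impressions if impressions else 0.0

    def guardrail_alert(exp_clicks, exp_impr, ctrl_clicks, ctrl_impr,
                        max_relative_drop=0.05) -> bool:
        """Return True if the experiment's CTR has fallen more than
        max_relative_drop below the control's CTR."""
        exp_ctr = ctr(exp_clicks, exp_impr)
        ctrl_ctr = ctr(ctrl_clicks, ctrl_impr)
        if ctrl_ctr == 0:
            return False
        return (ctrl_ctr - exp_ctr) / ctrl_ctr > max_relative_drop

    # A buggy experiment whose CTR collapsed would trip the alert.
    print(guardrail_alert(exp_clicks=400, exp_impr=10_000,
                          ctrl_clicks=500, ctrl_impr=10_000))  # True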

Google also has an Experiment Council, consisting of a group of engineers who review a checklist that needs to be filled out before an experiment takes place. The checklist is described as “light weight,” but we’re told that it covers a number of areas such as:

  1. What does the experiment test?
  2. What are the basic parameters of the experiment?
  3. What triggers the experiment – how are specific searchers pulled into live testing?
  4. Which metrics are of interest during the test?
  5. How big or small is the experiment, and how long will it run?
  6. How is the experiment itself designed?

The idea behind an experimenter filling out a checklist and submitting it for review is to help educate experimenters on the processes behind implementing an experiment, and to follow best practices in experimentation.
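Purely as an illustration of what such a checklist might capture once it is written down (the field names and values below are hypothetical and do not come from the paper), an experiment specification could look something like this:

    # A hypothetical experiment specification covering the checklist items above.
    # All field names and values are made up for illustration.
    experiment_spec = {
        "what_it_tests": "Background color of top ads",            # item 1
        "parameters": {"background_color": "#FFF7E5"},             # item 2
        "triggering": "1% of English-language web queries",        # item 3
        "metrics_of_interest": ["ad_ctr", "abandonment_rate"],     # item 4
        "size_and_duration": {"traffic_fraction": 0.01,
                              "duration_days": 14},                # item 5
        "design": "One experiment arm against a matched control",  # item 6
    }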

Google also has a discussion forum where experimenters bring the results of their tests to experts for discussion. There are three listed purposes behind this forum:

  1. Make sure that the experiment results are valid
  2. Make sure that the metrics behind valid experiments are complete
  3. Decide whether the experiment creates a positive or negative user experience. This last step can help decision makers decide whether a change should be launched, refined, or given up.

Conclusion

If Google makes at least one change a day to the way it provides search results, a testing infrastructure, testing tools, and educational processes like the ones described in the paper make a lot of sense in managing those changes.

If you’re interested in learning more about the framework behind the experiments that Google live tests on its users, you may want to spend some time with this paper.

If you see something unexpected at Google when performing a search, keep in mind that at any point in time you may be, to use Patrick Riley’s term, a lab rat.


63 thoughts on “We’re All Google’s Lab Rats”

  1. … “If you see something unexpected at Google when performing a search, keep in mind that at any point in time you may be, to use Patrick Riley’s term, a lab rat.”

    Nice Bill.

    Speaking simply of SERPs, the problem with ‘seeing’ something unexpected is that effectively collecting, aggregating, and maintaining volumes of information during a larger SEO project typically requires the use of many automation tools, from various locations, effectively limiting actual “eye-ball” time. And when obsessed enough with a particular client vertical where “eye-balling” is more addiction than value-added, what is ‘seen’ is usually too small a sample of variability to predict in other verticals, industries, etc…

    While not able to provide data in any meaningful or scientific manner, I regularly see consistent variability on various PDAs.

    Michael

  2. I am constantly seeing small tweaks. For instance, today I am seeing a purple / light coloured background to the top level PPC ads instead of the regular yellow one. Lots of things every day; it’s sometimes hard to tell the difference.

  3. I know Google is constantly improving, but I didn’t really know how frequently they do it. Well, it wouldn’t hurt to be a part of the experiment; at least I’m contributing to something.

  4. Hi Michael,

    Very good points. Interface changes are much easier to notice as well, while other changes such as new machine learning approaches, categorization of pages, rerankings of results based upon many different kinds of factors, and other behind-the-scenes type alterations of algorithms may not be so easy to spot – regardless of whether you’re spending a significant amount of time looking at search results, or you’re trying to use some automated tools.

    In addition to that, if you do see changes related to how pages from one or more sites are ranked for specific queries, or sudden changes displayed in your analytics program, it isn’t always easy to attribute those to a change in the way that a search engine is working. Changes can also be attributed to something that others have done on their websites, a shift in searching patterns or consumer tastes, and even changes (whether intentional or unintentional) to your own pages.

    While we do sometimes learn about some of the changes a search engine like Google has made (such as the announcement that we would see more results for “synonyms” of query terms in search results a couple of months ago on the Official Google Blog), we often don’t hear about many of the other alterations to their search algorithms. That’s one of the reasons why I like looking at patent filings and whitepapers from the search engines – so I can at least get some hints at some of the changes that they may have made, or may possibly make in the future.

  5. Hi Kristen,

    That’s one of the reasons why I subscribe to a lot of different internet marketing blogs via RSS – sometimes people who do see something odd capture screenshots of changes. It’s easier to pick up on actual physical changes to the pages that the search engines show us, though, than things that might influence how pages are ranked in search results.

  6. Hi Andrew,

    That’s one of the reasons why I thought it might be a good idea to include some statements on how frequently Google might be making changes, like the Matt Cutts video above. I’m not sure how many of those changes they make might influence everyone, or us directly – some of them could include things like better indexing for pages written in simplified Chinese, for example.

    What isn’t in doubt is that search engines are evolving over time. I did find the “educational” aspect of Google’s experimental approach to be pretty interesting. It’s good to see a process in place for reviewing experiments both before they take place, as well as a detailed analysis of the results afterward.

  7. It’s alive!

    Perhaps we should turn the tables and archive the evolution of Google on a moment by moment basis. Automate a search on a handful of terms and take snapshots (text and screen shots) of the results. Over time we’ll be able to play back this evolution and see how the Google organism grows.

  8. Good article with a bold title. Don’t worry, let Google do experiments on us; we will resist and update our SEO immune system to fight and survive with wisdom. The human mind is more beautiful than Google’s algorithms. He He 🙂

  9. Great article. I have often wondered about how all these changes will affect rankings in the future. I am sure one day they will change something that will really shake up the SERPs, but I am glad that they are only relatively small tweaks for now!

  10. Yes, the World is a Google experiment :D. They are always experimenting and testing new stuff and they show them to some people, then gather data and make decisions. They make many new services so they can gather more and more data. Watch this video about the Google Toilet 😀 http://www.youtube.com/watch?v=hrontojPWEE

  11. It would be great to know what tests they are carrying out and on what basis, though. I think we’d all agree that there are more and more spammy sites making it to the top of the SERPs – so they must be missing something or being too specific in their tests. Each change they make will surely have a negative impact somewhere else. Negative impact for us, of course! 🙂

  12. Great article, thanks. Being a lab rat is not always boring and bad. For someone new in the business, change can be very positive. Change gives newcomers a chance to make it to the top of the SERPs.

  13. Terrific post, Bill. I’ve always appreciated your thoughtful insights into Google’s scientific method and ranking algorithm. You’re destined to do a TED talk someday on search engine R&D and patents.

    I love that Google gives engineers the freedom to speak openly about their research and, at times, commit what once might have been a PR faux pas and now, simply, one key dimension of Google’s Weltanschauung. I’ve always thought of us as experimental subjects when searching on Google. And Google?

    Watchmen.

    Perhaps closer to the online embodiment of Dr. Jon Osterman/Doctor Manhattan than anyone wants to admit: not God, but a quantum superhero, evolving further and further from the technologically-challenged 20th Century man.

    Fortunately Google leadership is more Feynman than Oppenheimer and the experiments appear to be benign. There are far worse pre-apocalyptic or even Medieval scenarios that might have played out if the ratcatcher weren’t so Googley: rat-baiting, for example. On the upside, Riley might have called us mice, which haven’t made nearly as great a contribution to modern science as the domesticated albino Norwegian rat.

    Great recommended reading, by the way. The story behind your sidebar recommendations deserves a post if you haven’t written it already. Just a friendly nudge for your culture category.

  14. This is really amazing. Nice video of Matt Cutts. I was really surprised to hear him say that “Google tends to make at least one change to core web search algorithms at least once a day”. It is even more amazing to hear that we are “lab-rats” at times.

  15. I am seriously just waiting for Google to take over the world. Nothing can stop them. They are just so friendly and offer everything for free. I mean what could go wrong!? Right?

  16. Watching Google since its very beginning, I can say for sure that the algo has changed. And has changed quite a bit. Their goal is to provide the best search results they can. To do that they must change to keep up with the spammers. I’ve charted this to explain the concept to my clients.

    I’ll point out that one of the most noticeable changes is how Google uses PageRank. All the white papers and such listed here are a fairly good indicator too. It’s the smart SEO that rolls with the changes. Beyond that, if you are an SEO that uses best practices, the little changes that happen don’t trouble you much. My 2 cents.

  17. I agree with you; as you said, “Many of the changes Google makes to its algorithms aren’t always visible to its users,” and because of this it becomes very difficult to find out what changes they have made.

  18. Seriously, Google has OCD with tweaking its results, and they should. I think that Google observes everything now, even though others will say that there’s no way they are going that far. Do you think they are observing one’s Google Analytics account and looking at bounce rate, time on site, % of return visits, and similar data like that for ranking?

    Or is this illegal? I haven’t read the Google Analytics policy either, so what does everyone think? Either way, I am sure they are testing CTR on the SERPs and whether people return back to them quickly after.

  19. Hi James,

    Google and the other major search engines have been doing something like that for a while now. First they focused upon creating a graph or map of the Web and the links between pages, and compared them over time to see changes, to see what they could learn from those changes. More recently, they’ve been looking at a graph of clicks on links between pages to better understand how people actually travel across the web, and comparing different copies of those graphs across time.

    I wonder if they’ve been charting their own evolution as you suggest. It would be interesting to see something from them that describes their growth and evolution. I’m not sure that many others have the tools to be able to do so.

  20. Hi Gaurav,

    Thank you. I thought about which title to use for this post for a while, and that choice seemed to stand out.

    Google’s experiments go beyond SEO, to the evolution of search and the way that people search. For example, it makes sense that if more people begin searching for videos, the search engine should make videos easier to find and view.

    Hopefully, the choices on changes to make are wise ones. If you’ve seen critiques such as Is Google Making Us Stupid?, you may be wondering the same thing.

  21. Hi James,

    There have been changes to ranking algorithms in the past that have affected a large number of sites, or that have gotten a large amount of attention. For instance, a recent change that seems to have impacted how much traffic a number of sites receive for long tail queries has spawned massively long threads in places like Webmaster World.

    Something to keep in mind when you do see changes in search results for your pages – there may be many possible reasons for those changes, which can include:

    1. Something that’s changed on your site – either based on something that you’ve done, or possibly a change to your server or a change made by your host.

    2. Changes made by other sites that rank for terms that you may be competing for.

    3. Changes made by the search engine involving how it ranks or displays pages.

    4. Changes in the way that people search.

  22. Hi Tom,

    Sometimes we do learn about experiments that Google is performing while those experiments are happening, for example when someone sees something odd and blogs about it, or mentions it in a forum thread.

    It can be really hard to tell if a change results in more spam or less from a vantage point that doesn’t always allow us a bigger picture. Do we see more spam results because of something the search engine does, or because people spamming are trying other approaches?

    There is also always the possibility of a change to a search algorithm having unintended consequences that may not be very popular.

  23. Hi nikolainikolovvv,

    It can be a little intimidating to learn about how much information search engines collect, and think about possible ways that information might potentially be abused. But then we are also seeing efforts from the search engines to try to protect our privacy as well, such as in this announcement from the Official Google Blog:

    Search more securely with encrypted Google web search

    Google also provides a number of ways that you can use to make sure that they capture less information when you perform searches on Google, such as turning off their data collection, and not allowing them to capture your Web history. I think those are positive signs.

  24. Hi Johan,

    I agree with you. Changes shouldn’t always be looked at as obstacles – often they are opportunities. For example, when Google introduced Universal Search, it pushed down some web pages in search results so that Google could show Maps and Images and Videos. While you could possibly complain about those web pages being pushed down in search results, you could also make some videos, use better images, and work on optimizing your pages better for local search.

  25. Hi Kevin,

    Thank you. I’m not sure about that TED talk – would that be a topic potential viewers would find fascinating? I guess it might be if approached in the right way. 🙂

    I do feel much more comfortable with Google’s experimentation than I might with that done by many other search engines – maybe that’s more a matter of how they’ve consistently presented themselves over the years than anything else, but their publication of how they perform experiments is on the same transparent level that we’ve seen in blog posts, Google papers, and involvement in marketing and technology conferences. Definitely more Feynman than Oppenheimer.

    Nice suggestion on my recommended reading – thank you for that. I may just write that post.

  26. Hi Joel,

    I know that Google has shared information about how many changes they make in some pretty high profile places, like the New York Times, but I really don’t think most people are aware of how often those changes may take place.

  27. Hi Sean,

    Regardless of the best of intentions, unintended consequences do happen. As much as I like search, I try to hold on to a rational skepticism about changes that the search engines might make.

  28. Hi Donnie Lee,

    Every so often, someone writes an article or a blog post about a change at Google with a title that goes something like “SEO is Dead.”

    The truth is that just as the way people search evolves, and the search engines evolve, SEO evolves as well.

    One of the reasons why I spend a lot of time keeping an eye on search patents and papers from the search engines is because they sometimes provide a window into possible changes that we may see in the future. Maybe not spelled out in fine detail. For example, if a lot of what we start seeing in documents like those points to an increase in how user-behavior data is being measured and used by the search engines, we should be asking ourselves how, and where, and why, and when. Rather than saying that SEO is dead, we should be looking to see where it goes next.

  29. Hi SEO Eric,

    It’s tempting to think that Google might be looking at the analytics of individual sites, and they do have a program where you can share information about your site to see how you fit within similar sites. But I’m not sure how much they use that information.

    Google collects an incredible amount of information from many places, including their own log files and search and browsing histories from users, and have so much data that one thing that might really challenge them is deciding which data to pay attention to, and which data to ignore.

  30. @Bill Slawski
    I am not very familiar, but I guess encrypted search is to protect us from other people, not from Google itself 😉
    Google is just getting so big and powerful. Yes, we can take some measures to protect our data, but the normal user does not know anything about that. So Google will always get the data they need, even by just leaving the option they need as the default setting 😉

  31. But is there any way we can request Google to crawl my site in real time, or at least when an article is published?

  32. Hi nikolainikolovvv,

    That’s true, but I mentioned that because I think it shows that Google is presenting itself as being concerned about protecting the privacy of its users.

    I’m not going to say to not be concerned about the amount of data that Google collects, because that’s something that worries me as well.

  33. Hi Susan,

    If you’re using WordPress for your site, you could try the Pubsubhubbub plugin, which might help inform Google more quickly that there’s new content on your pages for it to index.

    More about Pubsubhubbub here

    Of course, there’s no guarantee that Google will crawl your pages faster – all the search engines move at their own speed, and take a good number of factors into consideration when they might crawl pages, but Pubsubhubbub might help.
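    If you’re curious what a plugin like that is doing under the hood, a publish ping is just a small HTTP POST to a hub. Here is a rough sketch; the feed URL is hypothetical and the hub address is the public hub commonly used at the time, so treat both as assumptions rather than a recommendation:

      # A minimal sketch of a manual Pubsubhubbub "publish" ping. A plugin would
      # send something like this automatically whenever new content is posted.
      import urllib.parse
      import urllib.request

      def ping_hub(feed_url, hub_url="https://pubsubhubbub.appspot.com/"):
          """Tell the hub that the feed at feed_url has new content."""
          data = urllib.parse.urlencode({
              "hub.mode": "publish",
              "hub.url": feed_url,
          }).encode()
          with urllib.request.urlopen(hub_url, data=data) as response:
              return response.status  # 204 means the hub accepted the ping

      # Hypothetical feed URL:
      # ping_hub("https://example.com/feed/")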

  34. I would be ok with being a lab rat if something were happening with the results out of what I do and contribute… Yet there is no real shift in strategy; at least not from what I see.

  35. Hi Lars,

    I think if you were to look at search results for many queries ten years ago, and compare them to search results today that you would notice a lot of differences, and I suspect many of those are influenced by those experiments, by looking at user-data from people who use the search engines, and what they select in results, what they decide to view, which pages they spend time upon, and more.

    There are a lot more query refinement suggestions in results now, universal search results that weren’t there ten years ago, including news and videos and blogs and books and more.

  36. The concept of being one of Google’s lab rats really does ring true. In the UK we were really hit by the Caffeine update, as it really affected our local search; it felt like Google’s reaction was ‘oh, yeah, and…?’. But what can we do? Until another search provider takes anywhere near the market share of Google (will it happen?), we’re always going to be at their whim.

  37. Hi Graeme,

    That’s one of the reasons why I think it’s a good idea for webmasters to learn as much about SEO as they can, so that they have some idea of what to try to do when faced with an update that might create upheaval and changes.

    It’s also a good idea to try to develop other ways to get traffic to your pages that might not be so dependent on search engines, or upon any specific search engine.

  38. The post really presents a very good point of view. I am new to blog posting, but I think I can see that there are changes from time to time. We all know that nothing is constant when it comes to online stuff. Google is definitely making its way to being the best, which is why there are certain changes.

    This is really great Bill. Thanks for sharing your point. 🙂

  39. I think this post and the paper it stems from make a great point in helping SEOs to educate end clients of SEO consulting and services. It’s good to be able to point to the fact that the major SEs are constantly testing and that any given query is part of multiple tests. All of this is in the name of improving search, but sometimes it can lead to unexpected results–even when their goal is to mitigate this, as is mentioned above. It’s great to show clients that these unexpected results sometimes can lead to ranking fluctuations and that it’s just the way the engines work. Often results will return to “normal,” but it reinforces the notion that it’s good to have a diversified SEO and overall online marketing strategy to limit the effects that test results may have on any particular ranking result.

  40. Hi Ramsay,

    A very good point. One of the reasons that I got into SEO was a concern that sites would be lost if they couldn’t keep up with the search engines, and their constant changes, and attempts to improve search. As you note, a diversified SEO and overall online marketing strategy can really help.

  41. “Lab rats” has such negative connotations. There is nothing unethical about what Google is doing, and you’ll find other large companies do the same thing.

  42. Hi Donna,

    The term “lab rats” in the title to this post was one that I took from the Wired Magazine interview with Google search quality engineer Patrick Riley. I didn’t mean to imply or suggest that what Google is doing is in any way unethical or negative. But when Google does do live testing rather than testing under controlled conditions, they are treating us as their test subjects, as if we were indeed lab rats.

  43. Hi Bill,

    No offense intended to either you or Patrick Riley. I meant it as a “general” remark. I agree it is an apt metaphor 😉

  44. Hi Donna,

    No offense taken. I do try to stay away from metaphors or cliches as much as I possibly can when I write, but I found that I couldn’t help but use the line from the Wired article in the title – it seemed like the best way to convey the concept in as few words as possible.

  45. Just watched the Matt Cutts video (seems to be an OK guy). Two things struck me:
    1) As mentioned above by Andrew, I didn’t realize changes were made by Google on a nearly daily basis. Good grief.
    2) Cutts says: “… I think content is necessary, it is not always sufficient”. His use of the word “think” says it all. Not exactly very reassuring. So here we have a big gun from one of the biggest web-guns telling us he “thinks” content is necessary. It shouldn’t be this way. Creating useful/interesting content and the ability to promote it are 2 different fields. I do realize we live in a commercialized world, but to me it’s just a pity that content is losing the battle minute by minute.

  46. Hi Doreen,

    It’s worth spending some time watching Matt’s videos. Sometimes he presents things in a way that makes clear something you may not have been certain of before.

    I’m not sure that you should read so much into Matt’s use of the word “think” as an indication of uncertainty on his part. Regardless of how great the content might be on your pages, if you don’t have at least some links pointing to those pages, people are going to have trouble finding it.

  47. We are definitely Google’s “Lab Rats” but, as many have pointed out above, the tests are made to improve the quality of the user experience and the search results. As SEOs, I think it gives us an advantage over slow moving, non-innovative companies that do not have the resources to keep up with the latest search engine updates.

  48. @ search monkey

    All of a sudden, unannounced, against better judgement, contradictory to the party line espoused, yet mostly spammy results are designed to improve the user experience and results?
    SEOs know very little more than the guy that logs into his Google account before searching.

  49. Hi Search Monkey,

    One thing is definitely true – Google is striving to continue to be known as an innovative company. Whether that gives us any kind of advantage or not, I don’t know. But they do seem to be in a race to improve search, even if none of the other search engines can quite keep up.

  50. Hi Michael,

    What we really don’t know, and may have difficulty gauging is how much spam is actually proliferating and advancing itself. With any ranking algorithm, there are always going to be people who attempt to take advantage of it. If you’re seeing more web spam, it’s not necessarily because Google isn’t trying to do something about it, but possibly because there are a great number of people who are trying to take advantage of how Google might work.

  51. I love how Matt Cutts relates even the most complex elements of the algorithm in an understandable and easy to grasp manner. He is the guru of all gurus in search.

  52. Hi Matthew,

    Matt does do a good job of simplifying the issues he discusses, though sometimes I’d like to hear the more complex answer, which he may not be able to share.

  53. What I see, no matter what algorithm changes occur (and I am an SEO novice), is that they still give heavy weight to links; it doesn’t seem to matter where those links come from, just links. Also, domain age is key.

    I have seen a website rank high with bad/no content and no backlinks, but 6 years old. I am still waiting for it to drop out of the SERPs.

    Another site, for the same competitive keyword, has now been in position 3 for nearly 3 months, with links from adult sites and links from sites that redirect to his/her website. Again, from my research this site shouldn’t even have got to page one, but there it is.

  54. Hi Darius,

    Of course Google gives value to links. It’s the basis behind PageRank, and behind the hypertext relevance involving anchor text. But there are many other signals that Google uses as well, and it looks like they are adding more signals regularly.

    There’s very little that we can show as proof in any way that Google values domain age, and it may or may not be a factor that Google considers.

    Is there a particular reason that you might think that links from adult websites, or links that redirect to a site, shouldn’t be counted by Google?

  55. What I am saying is, if you build enough links, no matter what the quality of your site or whether it is more or less relevant than another site, you seem to rank higher. I understand Google needs some kind of measurement, but personally the quantity of links proves nothing regarding how good a site is; it just proves that people have money to pay for SEO or time to do SEO.

    I understand there are lots of factors that probably go with this too.

    As for domain age. I have seen a site that does not have a great deal of links or quality links or good content and it ranks high on a competitive keyword. The only thing I can really see that it has going is Domain Age.

    Adult Sites. True. I don’t know about that one, as you say, why should they be discounted, a link is a link.

    Redirects. Well. To me. If there are a lot of 301 redirects from the link url, I would look at that as being something fishy.

    As you say. Who knows what goes on behind the curtain, just sometimes, with that site I mention ranking high, I just can’t see the logic behind it…

  56. Hi Darius,

    Yes, we can’t be sure what actually goes on behind the curtain sometimes. I have seen brand new pages linked to by a very high PageRank page appear out of nowhere, and rank in the top three results for queries that were both very competitive and had tens of millions of results. One link for each page.

    But if you searched for the back links for that page, you might not see the one from the very high PageRank page since Google only shows a random sampling of back links with their link search operator.

    Links do still have value in Google’s ranking algorithms, but Google does seem to be adding more and more signals, and pages with lower PageRanks showing in the toolbar will often rank above pages with higher Toolbar PageRanks. I mention the toolbar PageRank because it’s really the only metric we have to see the link analysis value of pages.

  57. Let me ask one final question then, as I don’t fully know how it would work.

    Let’s say I create an amazing site; it has all the true answers to the universe and the meaning of life explained on it. Something everyone is searching for (not really, but you get my drift).

    I create the site. I possibly add it to Webmaster Tools, but that’s it. Nobody links to it; it just sits there. It has 10 pages, great content, etc.

    Is this page ever going to get to page 1, result 1 of Google for ‘Secrets Of The Universe’ (a highly competitive keyword)?

    If not, then isn’t that a major flaw in ranking algorithms right there? Shouldn’t naturally great sites be found naturally and float to the top naturally based on content alone?

  58. Hi Darius.

    Sounds like a great website.

    Chances are that if no one links to the site that it might have some difficulty ranking.

    It’s a little like Picasso creating one of his greatest masterpieces and then locking it up in a vault where no one can see it.

    If you verify it in Google Webmaster Tools, and submit an XML sitemap there, Google stands a better chance of discovering the pages of the site, but without any links to it, it just may not rank in any way. Chances are that Google is also using their toolbar to discover new pages as well, but again, without any links, it’s going to have a hard time ranking for much of anything.

    The link-based aspect of Google’s ranking algorithm is a measure of importance, and while Google may be working on other importance signals that might be independent of links (social signals, author association signals, quality signals, etc.), at this point links are still an important aspect of how Google ranks pages.

  59. After the Panda update of 2011 I started to take SEO seriously, but after reading this article I truly understand what kind of a company Google really is. I’m simply amazed by the amount of data they are collecting each day with every experiment. After all, there are around 60 million users who visit Google every day. And if they are conducting one experiment every day, then it is going to be quite a large amount of data.

  60. Hi Hamza,

    I think I read recently that Google conducted about 50,000 experiments last year, and incorporated about 500 changes to their core ranking signals for web search. Some of those experiments might not be all that earth shattering, such as changing the font size for something they might display, while others could potentially have a significantly bigger impact.

    It makes sense for them to try to look at as much data as possible, just like any other website owner might when they are contemplating making changes to their websites. As a very big website, they have a lot of data to look at.

  61. I strongly believe that Google always tests its algorithms 24×7.

    The problem, however, is with my analytics; Google Analytics always changes when the algorithm affects my sites.

    But apart from that, the other analytics like CloudFlare and Chartbeat show no significant changes. I believe every analytics algorithm must be the same apart from Google’s. Google somehow modifies all its algorithms related to each other, and it affects end user websites.

    I suggest you should make a post on Google Analytics versus other Analytics someday. That would be an interesting read 😉
