Does the amount of time it takes for a page to load in a browser influence search engine rankings for pages? Should it?
If it did, might sites that were all text show up higher in search results than sites that included pictures and other applications? Or might a search engine find a way to account for different types of sites, based upon actual user data in addition to the amount of time it takes a site to render in a browser?
A recent patent application from Yahoo explores ways that a search engine might consider the amount of time it takes different types of pages to render and other issues involving how quickly pages respond to visits in the ranking, classifying, and crawling of those pages.
Latency is a big fancy word that means the amount of time between when something was started and when you can see its effects. It’s a word that shows up very frequently in the Yahoo patent filing. It’s a word worth learning a little more about, especially when it comes to websites, how people use them, and how a search engine might track that use.
A search engine may look at a wide range of information to decide whether or not it will visit and index pages on the Web, how it might rank those pages in search results, and how it may classify those pages.
A search engine will likely consider a wide range of informational signals. Those can include the content that appears on web pages, links, and the text within links that point to and from pages, information about how people use specific web pages, and other information about pages and the sites that they appear upon.
A search engine might also look at how quickly pages load and render in a browser, how much people might tolerate when pages load slowly, and how good an experience websites might deliver to their visitors.
The patent filing is:
Web Document User Experience Characterization Methods and Systems
Invented by Konstantinos Tsioutsiouliklis and Marcin M. Kadluczka
US Patent Application 20090187592
Published July 23, 2009
Filed January 23, 2008
Methods and systems are provided that may be used to characterize in some manner the performance that a user may experience when accessing a web document. An exemplary method may include accessing at least one performance characteristic associated with at least a portion of a computing environment adapted for sharing at least one web document and establishing user experience information associated with the web document based, at least in part, on the performance characteristic.
Informational Signals and Search Engines
When a search engine ranks pages in search results, it will explore signals that indicate how relevant those pages are to queries that might be used to find them, such as using words upon those pages that appear in those queries. A search engine may also look at signals that indicate the quality of the web pages that it might list within those search results.
A measure like PageRank is supposed to indicate quality rather than relevance because it looks at the number and “importance” of links pointing to a page to try to determine how important a page might be. There are other quality signals that a search engine may use. Some examples might include the amount of text upon a page, how readable that text is, whether the page contains broken links, and possibly hundreds of other factors.
A search engine wants to return pages in search results that are both relevant and high quality.
Another set of signals or factors that a search engine may use involves how people interact with pages that they find on the web. These can include which pages people select when they see them in search results for a specific query, how much time people might spend on a page they’ve selected before they return to the search engine, how far down a page they might scroll, whether they bookmark or save a page, and others.
This patent filing focuses upon how well pages might meet “desired user experiences” by looking at the performance of web pages and actual user interactions with those pages. It tells us:
With so many websites and web pages being available and with varying hardware and software configurations, it may be beneficial to identify which web documents may lead to the desired user experience and which may not lead to the desired user experience.
By way of example but not limitation, in certain situations, it may be beneficial to determine (e.g., classify, rank, characterize) which web documents may not meet performance or other user experience expectations if selected by the user. Such performance may, for example, be affected by the server, network, client, file, and/or like processes and/or the software, firmware, and/or hardware resources associated therewith.
Once web documents are identified in this manner, the resulting user experience information may, for example, be considered when generating the search results.
User Experience Characteristics
The patent filing considers much more than how quickly pages load into a browser, and it may influence more than just the rankings of pages.
It tells us about an information integration system that can be used with search engines, job portals, shopping search sites, travel search sites, RSS applications, and other types of pages, and how it might look at those in at least three different ways:
Access – How long it takes to access a page or other kind of document when sending a request to retrieve it. Measuring access might mean looking at performance characteristics associated with a page, such as server performance and file performance. It might consider how quickly a page might load for visitors at different connection speeds, such as broadband and dial-up. A search engine crawling program might simulate connections at different speeds to measure how quickly a page loads for visitors coming to a page through dial-up or broadband connections.
Rendering – How quickly a page starts showing up within a browser (and it might emulate several different types of browsers), how a page loads in a browser, and how long it might take for the full page, or at least the part of the page above the fold to load in a browser. It contemplates that on some sites, some large pages might be set up so that even though they contain a lot of content, the content at the top of the page renders quickly so that a visitor doesn’t have to wait very long to start reading and viewing the content on the page.
It may also consider such things as “differences in complexity, size, many files, user interface mechanisms, embedded sections (e.g., advertisements, audio content, video content, security features, etc.), and/or the like,” to understand how a page renders, and how good of a user experience that might be.
User Experience – How do people actually use web sites, and how do they react to different access and rendering issues on different sites?
Different people might have different levels of patience in waiting for a site to load and render in a browser, and they might be willing to wait longer for some types of sites to load and render than others. For example, someone might be willing to wait longer for a page to show up associated with their bank account than for a “more generic” type of page.
Examples of other “user related performance characteristics” could include how visitors to pages react to things such as:
- Pages that fail to download or render within an acceptable period of time,
- Pages that automatically play video or audio content,
- Pages that include pop-up or pop-under advertisements.
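The patent filing doesn’t spell out how a crawler would simulate different connection speeds, but a rough back-of-the-envelope model helps make the idea concrete. All of the connection speeds, the page size, and the round-trip figures below are illustrative assumptions of mine, not numbers from the patent:

```python
# Illustrative sketch (not from the patent): estimate how long a page
# might take to load at different simulated connection speeds, as a
# crawler emulating dial-up versus broadband visitors might.

CONNECTION_SPEEDS_KBPS = {
    "dial-up": 56,        # kilobits per second
    "dsl": 1_500,
    "broadband": 10_000,
}

def estimated_load_seconds(page_bytes, speed_kbps,
                           round_trip_ms=100.0, requests=1):
    """Raw transfer time at the given bandwidth, plus one
    round trip of latency per request made for the page."""
    transfer = (page_bytes * 8) / (speed_kbps * 1000)
    round_trips = (round_trip_ms / 1000.0) * requests
    return transfer + round_trips

# A hypothetical 500 KB page assembled from 10 separate requests.
PAGE_BYTES = 500 * 1024
for name, speed in CONNECTION_SPEEDS_KBPS.items():
    seconds = estimated_load_seconds(PAGE_BYTES, speed, requests=10)
    print(f"{name:>9}: ~{seconds:.1f}s")
```

A real crawler would measure actual transfers rather than estimate them, but even this simple model shows why a page that feels instant on broadband can take over a minute to load on dial-up.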
How Measuring Latency and User Experience Might be Used
The inventors behind the patent application point to at least three uses that a search engine may have for measuring a website’s performance based upon access, rendering, and user experience. They are ranking, classification, and crawling.
Ranking – The information collected about user experience characteristics could be used to filter, promote, or demote web documents to improve the user experiences they deliver.
Classification – The user experience information might be used to classify pages in some way. The layout of a page might indicate that it contains certain types of content associated with certain types of sites. The patent application tells us:
For example, finance-related websites often display streaming data of the stock market, news websites also often stream content, and certain types of web pages might use frames or tables, which may be useful in classifying the web document.
Crawling – When a search engine has a list of URLs to visit that it hasn’t seen before, or that it might revisit to check for new content, it might consider several different things in determining which to look at first. The user experience information might help make some decisions to look at certain content on pages that a search engine might not have considered before. Here’s what the patent filing says about that:
For example, information relating to whether a user might abandon or wait for a web document to be displayed may be useful when establishing certain quality or relevance factors for the web document.
For example, information relating to whether a user might wait for or specifically request embedded or external objects to be downloaded and displayed may be useful when establishing certain quality or relevance factors for the web document.
In certain implementations, such information may, for example, be used to determine if a crawler or other like process should also execute such embedded and/or external objects to establish performance parameters, etc.
We don’t know if any of the search engines are presently using the processes described in the patent filing. Still, the patent application gives us some ideas on how a search engine might use information about page load and rendering times, and information on how people might react to those wait times.
How does waiting for a page to show up in your browser or render influence how you enjoy the page? Are you willing to wait longer for some types of pages than others?
Again, latency is the amount of time between when something was started and when you can see its effects.
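That definition can be made concrete in a few lines of code. This is a generic illustration of the concept, not anything described in the patent:

```python
import time

def measure_latency(operation):
    """Return (result, elapsed seconds) for a callable: the gap
    between starting something and seeing its effect."""
    start = time.monotonic()
    result = operation()
    elapsed = time.monotonic() - start
    return result, elapsed

# Example: an operation that takes roughly 50 milliseconds.
result, latency = measure_latency(lambda: time.sleep(0.05) or "done")
print(f"latency: {latency * 1000:.0f} ms")
```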
If you own a website, how much attention do you pay to latency issues involving your site? Will you now that you know that search engines may be paying attention?
92 thoughts on “Does Page Load Time influence SEO?”
It’s a fine balance… If I go to a site that I know has rich, quality content, I’m more likely to have the patience to wait for it to load. But if the graphics/content isn’t high-quality, my disappointment will probably lead me to look elsewhere.
Personally, I’m a fan of minimalism. Sites can look great without having a ton of graphics.
It makes sense for search engines to consider these factors, but to what extent? Again, it’s all a matter of balance…
I think it’s a factor in Google organic currently since they have pretty much come out and said that it’s a factor in their Adwords quality scores. They didn’t say how much weight they give it obviously but I think it’s definitely a factor.
Funny that Matt Cutts stated “not right now”
Video was posted April 28, 2009
Probably will be in the near future if it isn’t in the factors already
I think server responsiveness/latency and page download times are definitely a big part of the ranking. I’ve noticed with larger sites that if the servers get bogged down during a crawl, the Google bot will stop crawling and come back later – which results in fewer pages indexed.
You need your servers to be fast and dependable if you want it to be a trusted authority.
I think one problem with this is that latency will vary depending on the two points making the connection. So while extremely poor latency could be used as a factor, normal latency as experienced by the search engine may not equate to the latency for various dispersed users.
I’m thinking especially in terms of internationally hosted sites, and the various ways the net is hooked up. So if a bot from the US crawls a site in an Asian country, it would be very noisy to use the latency experienced by the US-based bot as any sort of proxy for the latency experienced by local users. I’m in Thailand now, and while Google connects straight into the backbone here (as most large corps do in most countries), national network issues complicate things.
Personally I’d say only very poor latency issues might be considered, if at all. IMHO of course 🙂
I wouldn’t be surprised if this became a large factor in the very near future for SEO. Overall quality of a website should be reflected where it ranks.
It’s interesting and helpful to get information from the search engines, like the videos that Matt has come out with. We do know that Google makes changes and updates to their search algorithms on a regular basis. It’s possible that Google may use something like what is described in this patent filing, if they haven’t started in the two months since Matt posted that video.
I was happy to see that there was more to this than just how long it took to access a page, or have it render in a browser. Actually looking at what people do when they arrive at pages, and seeing if they are willing to wait for a page to resolve makes it much more interesting. I agree though – it is a question of balance.
That’s a good point. Google does tell us that the amount of time it takes for a page to load is one of the quality factors that they look at for landing pages in paid search. It would seem that if they are interested in that there, that they would consider something like that for organic search as well. Even though this patent filing is from Yahoo, I think it gives us some ideas of things that other search engines might consider.
I think that’s an issue that is definitely worth raising. From a practical standpoint, slow servers and access times can have an effect on how well, and how frequently pages do get crawled and indexed (and updated as their content changes). It’s always a good idea to try to make a site perform as well as possible, for search engines and visitors as well. I do like that this patent takes things further. Some sites, like Amazon.com, have pages that are fairly large, and tend to load slowly in browsers. But those amazon pages also tend to render above the fold rather quickly, so you don’t really notice if the rest of the page might take a while to load. That shouldn’t affect their rankings negatively, if the focus is providing a good user experience. I like that the patent filing addressed issues like that.
Thanks – great points!
Google came out with a patent filing a few years back that looked at latency and quality of service issues over a network that was kind of odd, but interesting. See:
Communications network quality of service system and method for real time information
I’d say that the kinds of issues you point out are ones that search engines really need to be aware of if they want to consider latency as an influence on ranking and crawling. The reality is more complex than the patent filing describes, especially on a world wide web that is really world wide.
It seems like we learn about more and more possible signals that a search engine might consider everyday. Latency issues, and user experience data related to those issues may become part of how sites are ranked and crawled and classified, if they aren’t already. Even if they aren’t, I think it is worth trying to make sure a site can handle the traffic it receives, regardless of whether visitors are human or crawling programs.
Search engine optimization is all about what the search engine is trying to do, which is to send the best and most relevant information to the user. So I would think that if a page takes a long time to load, then it may not be the best solution for the user.
Improved SEO = improved usability. If the search engines aren’t looking at load time now, they will.
We’re thinking along the same paths I believe. I think it’s a good approach to view search engines as another visitor, or user, of your web site, and work towards making sites usable for them as well.
Definitely! First, page load time influences the UI and user experience, and therefore optimization status. Second, don’t forget about analytics code at the end of a page. Third, good SEO is about fast content delivery. Optimization of on-site / off-site parameters is the first task we do.
Thank you for creating such a great conversation with this post!
I think I’d summarize it this way so far:
1) Better speed optimizes user experience
2) Better speed certainly can’t hurt (and may help) your search engine rankings.
You should therefore seek to optimize page load times whenever possible. That being said, I’ve never seen a very public discussion of server settings one should tweak to create this optimal performance – anyone want to take a stab at that in the comments or in another post?
(I might be getting a dedicated server shortly and may get to ramp my knowledge to a much higher level in this arena shortly. 🙂 )
I think the load time of a site’s pages doesn’t just affect SEO but also conversion. It’s a known fact that people do not like waiting around for slow websites and will simply close down their browser and skip off to your competitor if they cannot view your website fast enough. So what is the point in having excellent SEO and search engine rankings if you then let yourself down with a slow website? 🙂
I come away thinking that site optimization does not only fall upon the designer, but the content producer as well (placing the proper tags to render images quickly, for example).
After having experienced a situation with my wife, who finally signed up for online banking, I got a good look at the UX from a different perspective. I would love to go further into that aspect of the patent. I wonder if search engines can really obtain the data to emulate UX properly? I think that there may be too many unknown factors, but it would be interesting to see how they could add this into the algorithm to determine relevancy.
I’m a strong believer in trying to deliver a quality user experience. It’s interesting that SEO is evolving in that direction, but it makes a tremendous amount of sense for it to do so.
I started coming up with a list of things that you could do to improve the load time of a page, which you might find interesting and engaging, and came across a number of excellent references, and a helpful tool.
The best of those references is probably this one: Optimizing Page Load Time. There’s a further reading section at the bottom of the page that includes links to a number of other articles that discuss reducing page load time.
The tool that I mentioned is Yahoo’s YSlow for Firefox
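To give a flavor of the kinds of changes those resources recommend, here are two common ones – compressing text responses and adding far-future Expires headers – expressed as Apache configuration. The directives are standard mod_deflate and mod_expires syntax, but treat the specific types and lifetimes as illustrative choices rather than recommendations from those articles:

```apache
# Compress HTML, CSS, and JavaScript responses (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future Expires headers for static assets (requires mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css  "access plus 1 month"
```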
I agree with you completely. Having people sit and wait watching a web page render is akin to inviting them to visit a competitor’s site.
As an aside, imagine being with the same bank for 20+ years, and finding out that you can do many of the things you do with the bank offline at their web site. You visit the web site, and they have a terrible user interface, incredibly slow pages, and confusing content and instructions. I’ve heard those kinds of complaints about many banks, and I have yet to visit my bank’s web site.
Unfortunately, the patent application limits its analysis of how user experience might be measured. It’s meant as an overview, with enough detail to possibly protect intellectual property, and not enough detail to act as a guide to others on the specifics of how user experience might be measured.
Imagine, though, that many of the tools used to measure that kind of user experience might be similar to tools described in other patents that indicated how rankings for pages might be increased based upon information collected from toolbars and browser helper programs, from search engine query log files, and from browsing and web history programs, amongst possible others.
We’ve seen patent filings that tell us a search engine might track how far down a page someone might scroll, where mouse pointers travel on a page, how quickly someone travels to a search result before returning to the search engine for a new search, whether someone saves a page or bookmarks a page, and other user actions that might measure user experiences.
Search engines are also providing analytics programs to sites, which allow their users the option of sharing their analytics data to access profile information for sites similar to theirs. Those programs provide information such as time spent on pages, navigational paths through sites, and other information that also might help provide details about users’ behaviors. Those kinds of profiles could be interesting in comparing how long people tend to stay on pages on banking sites versus how long people might stay on pages on other types of sites.
I really do think page load time, page size, and code/text ratio all affect SEO.
I’ve been operating for years under the premise that optimizing page load time is good for visitors – it has the potential to keep them on your pages longer and have them visit more pages. If you offer something for sale, it may help increase conversion rates if they don’t run into slow loading pages that convince them to look elsewhere. Decreasing bandwidth is also a very good idea if you get a lot of visitors. Simpler code, based upon intelligently designed CSS can make it easier to maintain and update content and make changes to a site as well.
Regardless of whether search engines may rank pages more highly based upon how quick it might be to access and render a page, there are other benefits to doing so that are worth pursuing.
Even if there is no direct correlation right now, it does have an indirect correlation.
If your site has thousands of pages and it is slow, bots will tend to crawl only a limited set of data and refresh it on a limited basis. This will indirectly impact your rankings.
By the way, I like the new background shading in this comment box.
You only have to look at two things to see how seriously Google et al. are taking site speed and performance.
1. Google Webmaster Tools has a page (Crawl Stats) with the last graph showing “Time spent downloading a page”
2. Google’s “Page Speed” project, encouraging webmasters and site owners to increase their web site performance.
If they didn’t treat load times, rendering times, etc. as important, then I don’t think either of these initiatives would have seen the light of day.
Thanks. Some very good points. Search engines do follow a politeness protocol, and will only go through a site at a certain speed. If you regularly add new URLs, and/or change content on old ones, and you do have a large number of pages, they may not all get crawled and indexed in a timely fashion.
Google’s webmaster tools do allow you to increase that speed, but it’s really only recommended that you do so if you have a handle on a lot of the performance limitations that might make that a bad idea otherwise.
Yahoo and Microsoft do present ways to decrease crawl rates with crawl delay statements in a robots.txt file, but that won’t help increase a crawl rate for a site. You can indicate that pages get updated daily in an XML sitemap, and that may influence those search engines to index more pages, but it’s possible that if you use that setting, it may not make a difference to how quickly pages on your site get crawled.
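As an illustration, here is what those two mechanisms looked like at the time (example.com and the specific values are hypothetical):

```text
# robots.txt – Yahoo's Slurp and Microsoft's msnbot honored Crawl-delay
# (Google did not; its crawl rate was set in Webmaster Tools instead)
User-agent: Slurp
Crawl-delay: 10

User-agent: msnbot
Crawl-delay: 10
```

```xml
<!-- XML sitemap entry hinting that a page changes daily -->
<url>
  <loc>http://www.example.com/news/</loc>
  <changefreq>daily</changefreq>
</url>
```

Note that `changefreq` is only a hint; search engines were free to ignore it, which is why it might not make a difference to how quickly pages get crawled.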
Optimizing the performance of a large site is important if you even hope that search crawlers will visit a sufficient amount of pages on a site.
Thanks. The search engines are providing information that can help site owners on web site performance. It does appear that they have an interest in making pages on the web run better.
Both Google and Yahoo published information recently on making web sites perform better and faster.
Google has a number of articles on the topic at:
Yahoo published a detailed article on the topic – Best Practices for Speeding Up Your Web Site
I think that load time is a poor way to determine the quality of a website. Obviously, I think extremely slow load times should be penalized; however, minimal delays shouldn’t be a big deal. Ultimately, anyone with a website that has 1MB of pics, files & text to download on a given page most likely needs to do some streamlining of their website (unless it’s video of course).
I agree with Bill,
Link no longer available
but I don’t know how much page size (in KB) and load time will influence SEO.
I think that it might not directly affect rankings. But if a website has thousands of pages it needs to optimize its crawl equity to the best of its ability which includes serving and rendering pages fast so that that search engines can get to as many pages as possible. I’m not sure if a site has 2 pages it really matters all that much. I think it comes down to not really optimizing the load time but optimizing the crawl equity that all your inbound links have given you.
Now I do think that it should play some role in rankings and might in the future as a quality metric of a site, but as for now I don’t think the Gods have it in the cards.
Good points. Some websites necessarily need more bandwidth than others. A site focusing upon art and images is going to use pictures, and will be slower than a site that can get its message across using primarily text.
What I found interesting about this patent filing wasn’t so much the emphasis on page load and rendering time, but rather that they combined those concepts with measuring user experience. Finding ways to increase the performance of pages in ways that provide good user experiences is a goal worth striving for on web sites. If doing so may mean the possibility of better rankings at the search engines, that can make the effort even more worth doing.
Thanks – the Google Code page has some very helpful articles on it. Even if the ideas in this patent application aren’t adopted by the search engines, increasing the performance of a web site can mean that search engine crawling programs may crawl more pages of your site, and possibly more frequently. That can be helpful, especially if you have a very large site.
Crawl equity, as you call it, is a good reason by itself for many middle-to-large sites (and sometimes even small sites) to consider the performance of their pages. At this point, we can’t be completely sure that it makes a difference for pages in terms of rankings, but it’s possible. And, if it isn’t being used, I agree that it may just be in the future.
Page load must be a factor. I have never seen any site listed in the top 10 for my searches load slowly when I surf 🙂
Google must be accounting for it. Having said that, I have seen some page not found errors. Maybe next time they index, they will clean those up.
It’s possible that when you visit a page in the search results, and get a “not found” message that the page was there when Google crawled and indexed it, and has since developed some kind of problem. That’s one of the reasons why Google says that they provide cache file copies of pages.
I don’t know how fast an internet connection you have, but the patent filing does mention that it might simulate different connection speeds when accessing pages, and seeing how fast they render, including dial-up connection speeds. They would also simulate the use of different browsers. I’m not sure that we can say that page load time is a ranking factor based upon the search results that we may see everyday. It is possible that the most important and relevant results for some searches are on pages that do load very slowly – should those be bumped down in rankings because of that?
We monitor our logs to see how Google crawls our site – Zoomin. We notice that when Google hits a number of slow pages, it slows down its crawling of our site. (Google indexes 80-200k pages of our site a day.)
From our experience speed does affect the number of pages in the google cache and ultimately traffic to our site.
It’s hard to know for sure whether slow web performance affects positioning in the SERPs; I think it should. Google have confirmed website speed is a factor in the Google Adwords quality score.
We certainly receive feedback from our customers that an increase in page load speed is generally accompanied by all types of other increases: increased productivity, increased time on site, increased user interaction, increased conversions, and increased page views.
Recently Shopzilla shared some interesting data on what happened when they sped their website up by 5 seconds – it resulted in a 25% increase in page views, a 10% increase in revenue, a 50% reduction in hardware, and a 120% increase in traffic from Google.
‘Shopzilla’s Site Redo – You Get What You Measure’ https://conferences.oreilly.com/velocity/velocity2009/public/schedule/detail/7709
So a 120% increase in traffic from Google… that’s pretty compelling.
I think it makes perfect sense that load time does affect SEO rankings to an extent. There are so many factors in how pages are ranked that load time is obviously not the be all and end all of search engine rankings. However, I do believe that it is still worth decreasing load time as much as possible, in addition to other optimisation methods.
Thank you for the link to the Shopzilla presentation – I really enjoyed it. Going from a page load time of 6-7 seconds down to a time of less than a second is quite an accomplishment on a site of that size. One point raised in the video was that they figured that people were waiting at the search engine after clicking a shopzilla link, and decided to click on a link to another page in the search results during that wait. Being able to capture those visitors that they were losing seems to have made a great difference in the amount of traffic they were receiving from Google.
We don’t know for certain if page load time makes a difference in search results, but it does seem to have the ability to impact how much of a site gets crawled by search engines, as well as how visitors respond to a site once they are on it. For those reasons alone, it’s an effort worth making.
Hi Bang Online,
I agree with you – there are many steps that a site owner can take to increase rankings and traffic, and page load time, if considered in rankings, is just one amongst many. If load time and rendering of pages can affect things like conversion rates as well, it could be an effort that pays for itself fairly quickly.
Thanks for sharing some of your observations and experience. Frequent indexing of pages can make a difference for sites that change content fairly often. Improving the performance of a site may make it more likely that a site is indexed more deeply as well.
I think pages that take a lot of time to load should be penalized by search engines in rankings. There are websites that take over 2-3 minutes to load even on a high speed connection; why should these sites appear at the top of results? At the end of the day it’s all about providing the best results for the users.
I think in some ways very slow loading pages do get penalized by being slow to crawl by the search engines. What’s a shame is that sometimes the most relevant pages are ones that are slow to load. Chances are that they should be at the top of search results for some queries, but if you have to wait 2-3 minutes for a page to resolve, there’s definitely a problem that needs to be fixed.
I wish latency did affect PageRank; it would stop me clicking on links and then having to make a cup of tea whilst the site is loading up.
I’ve been involved in usability for some time and wonder why most designers take no heed of Jakob Nielsen. Probably because when a site is built in an office with large bandwidth and super fast connectivity (T1) – well, why think about the user in their house, on the outskirts of a city or in the country with their DSL connection?
Yep, let’s hope it is part of PR.
I agree with Bang; load time cannot directly affect rankings. SEO can improve a site’s rankings, and its performance as well.
The Caffeine update goes in this direction, with faster indexing, so it’s quite logical that page load time will have a certain influence.
Google is also encouraging webmasters to improve load time with the Page Speed Firefox plugin (link no longer available).
Anyhow, fast-loading pages are always good for users, and you know, ‘What is good for users is good for search engines.’
You would think that designers and developers would consider things such as how a page looks in different browsers and how quickly it might load at different connection speeds. But there are other concerns as well, such as security, and how easy a page is to use, administer, maintain, and update. Many sites start adding additional features over time, and as they grow, I think sometimes the addition of features may seem to be worth a site loading more slowly. Many sites also grow in size while outgrowing their original infrastructure.
Increasing the performance of a web site can have beneficial impacts, even if latency isn’t being used as a quality signal by search engines (and we don’t know if it is or isn’t at this point, though as we see in the patent, it possibly could be).
The patent describes one way that load time could affect rankings. It’s possible that other methods could be developed as well that could have an impact upon how well a page ranks in search engines.
If an SEO does their job well, they should also understand the impact that increased traffic might have upon the performance of a site, and make that clear to their clients.
It is interesting to see both Google and Yahoo release information and Firefox plugins to help webmasters increase the performance of their websites. And Google is leading by example with their Caffeine update, so we know the issue is something they consider important.
You’ve brought up some very interesting information concerning load times. It seems to be a very controversial area where everyone seems to have different answers, and as you say, we don’t even know if search engines are using this process mentioned in the patent. Since site visitors see page load times as one of the reasons to click away, I guess it’s important to keep it to a minimum anyway. Excellent post as always.
Thank you. I do think that there is agreement that faster loading pages do have the potential to lead to more page views, and fewer people abandoning pages because they’ve grown frustrated waiting for a page to resolve. It can also mean that a site gets crawled deeper and more frequently. The controversy is whether search engines consider page load or rendering in rating the quality of a page for ranking purposes. There are enough potential benefits to having a well-performing page that it’s worth working on regardless of whether there is a potential ranking benefit or not.
I think David Dalka raised an excellent point: SEO isn’t just about creating a good page that will load quickly. It is also about using a quality hosting company that is reliable and serves pages quickly. A webmaster could create an excellent website with relevant content and fast-loading pages, but those efforts could be sabotaged by the use of a poor hosting company.
Does anybody have any thoughts on shared hosting versus dedicated hosting accounts? Is there much difference in load times during normal usage? We have always used virtual dedicated hosting platforms, but I’m guessing these could be susceptible to performance issues during peak times. Are dedicated servers immune from these issues?
Thank you, and great topic!
Thanks. Some might argue that SEO has nothing to do with page load time, but slow loading pages can have an impact upon whether or not someone visits a page, or leaves once they arrive.
If someone searching clicks on a search result in Google, and there’s a significant delay between that click and the time that it takes to move from Google’s results to that page, it’s possible that a searcher might click on another Google result before being transported to that page. The video that Aptimize linked to in his comment above describes that happening for Shopzilla results before they redeveloped their site to make it much faster.
I agree with the use of quality hosting completely, but it’s just as possible to have a slow loading and rendering site on a good host as it is to have a quick site on a slow host.
I’d definitely recommend spending some time at http://www.webhostingtalk.com/ if you have questions about hosting, and differences between shared and dedicated hosts. It’s a pretty good resource on hosting issues, and hosting companies.
Thanks for your insights. I do understand that latency / page loading time makes web surfers frustrated (whether due to bad coding, a large site, heavy traffic, a bad host, etc.), and therefore Google would consider it an issue. I also understand that the geolocation of the web host server, matched to the web surfer, is important for latency (and relevancy) and therefore Google ranking (at least where the server and the web surfer are in the same country, according to the Matt Cutts videos).
What I don’t understand is: would a server in, say, Sydney, Australia rank higher with Google than a server in Perth (the other side of Australia) for a web surfer in Sydney, based just on geolocation and not latency?
What you may be seeing regarding the rankings of search results for a searcher in Sydney seeing higher rankings for a site on a server in Sydney than for a server in Perth might have something to do with limited personalization of search results based upon geolocation. I’m not sure if I have enough facts regarding the circumstance that you are asking about to tell you that with any certainty, but here are a couple of posts I wrote based upon Google patents that may hold some ideas for you. Note that while they discuss different rankings for different countries, the first one introduces the possibility that they might do some limited personalization based upon smaller geographical areas such as countries and cities:
How Google Might Personalize Search Results Outside of Personalized Search
Changing Google Rankings in Different Countries for Different Searchers
I don’t know from your question if there are other things about those sites located in Sydney and Perth, besides server location, that Google might consider if it were to attempt to tie them to specific geographical regions. If those sites contained content that seemed to make the one located in Sydney more relevant for searchers in Sydney than the one in Perth, then that might possibly be a reason for the difference you’re seeing.
This is the first I’ve heard of latency being used in any way in relation to rankings, or even being recognized in the SERPs. Great information, and it gives me a lot to think about. Thanks.
Hi Chicago Web Design,
We don’t know for certain if the User Experience approach from Yahoo is being used at this point, though it’s possible that it might be. And it’s worth thinking about.
I do know that if you have a very slow loading site, someone might click on a link to your page in search results, not see anything happen, and click on another result instead. Or if a searcher doesn’t have a fast connection, they might wait for a few seconds, and then click on their back button, or not venture any further into other pages on your site.
It’s an interesting theory; I never thought about it. Load time in a browser influences bounce rate, since many people don’t like to wait. I’ve heard that bounce rate can also influence the SERPs, but that’s also only a theory. Maybe somebody has tested this, but I don’t know of anything like that.
Not necessarily a theory – Yahoo really did publish this patent filing that describes how they might consider things like how long it takes a page to load and to render, and how user interactions with pages may influence rankings for those pages.
I did use the patent application as an introduction to what I hoped would be a discussion of other ways that latency might influence the rankings of pages, and there have been some great comments and responses to the question.
Bounce rate may not be a good ranking factor for a few reasons. One of the most important of them is that a page may provide the perfect answer for a searcher’s query, and that visitor may be completely satisfied with what they saw on the page they visited, and left the site after seeing it. That’s a bounce, but not something that should negatively impact the rating of a page.
Very interesting article. The idea has been thrown around for a long time, but I’m not necessarily convinced, and even if it did influence a site’s position in the SERPs, it would only be a very small influence. Referring back to Adi’s comment, I would be very surprised if bounce rates affected the SERPs; like Bill said, a high bounce rate isn’t always a negative thing, plus there are a number of one-page (jQuery-influenced) websites that rank very highly, mainly due to the amount of content on that one page.
Hi SJL Web Design,
Recent news from Google says that they will be paying much more attention to page load time in the future as a ranking signal. I would assume that it would be considered a quality factor, since load time shouldn’t impact the relevance of a page to a query, and that it would likely only have a small influence. But sometimes the differences between positions in search results may not be as great as you might think. Small differences could still make a difference.
I believe most search engines will improve their algorithms for crawling a site to be more similar to how humans think and feel. I’ve read many SEO tutorials that encourage site owners not to use too many scripts, Flash files, and images, as these objects will slow down page loading. And I agree, because I myself will immediately click the back button or close the tab if a webpage takes too long just to display some Flash content.
Thanks for giving me more knowledge, Bill. 🙂
You’re welcome. One of the things I found interesting about this was that it doesn’t just look at how quickly or slowly a page loads or renders, but also pays attention to how people actually use a site as well. Some sites require more images, scripts, and other features that might require longer loading times than others. And the patent application recognizes that, and looks at User Experience information as well as other factors.
I see. It sounds fair enough for such sites.
Do you have any predictions about other human factors that search engines will take into account when doing their crawling, Bill? I believe they improve their methods day by day.
The patent gives us some examples. Here are a couple.
For instance, if a search engine is paying attention to the browsing history of people visiting a site, and it notices that people spend time clicking upon and waiting for extremely large images to load in their browser, a search engine might decide that those images are worth visiting and indexing even if they are very large. Without that information, a search engine might balk at indexing some very large images.
If a financial services site contains an application that takes a long time to load, and most visitors to the site tend to stay around regardless of the loading time, the slowness of the pages might not be considered a negative signal because the behavior of people on those pages indicate that they find value in waiting. In that instance, the slow loading pages might even be seen as a positive ranking signal because of that associated user data.
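To make that second example a bit more concrete, here’s a rough sketch of how observed visitor behavior might offset a slow load time. The function name, thresholds, and signal values here are all hypothetical, invented for illustration; the patent filing describes the idea qualitatively, not with numbers like these:

```python
def latency_signal(load_time_s: float, avg_dwell_time_s: float,
                   tolerated_dwell_s: float = 30.0) -> float:
    """Toy model: a slow page is only penalized if visitors don't stick around.

    All thresholds are made up for illustration; the Yahoo patent
    application doesn't specify concrete values like these.
    """
    if load_time_s <= 2.0:
        # Fast pages: no latency adjustment either way.
        return 0.0
    if avg_dwell_time_s >= tolerated_dwell_s:
        # Visitors wait for the page and then stay: the slowness is
        # tolerated, which may even suggest the content is worth the wait.
        return 0.1
    # Slow page that visitors abandon quickly: a negative signal.
    return -0.1
```

Under this toy model, the slow financial application that visitors happily wait for would get a small positive adjustment, while an equally slow page that people abandon would be penalized.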
I’ve got a client who has major server downtime and slowness issues. I’m pushing to get that resolved, but it’s been an off-and-on thing for probably six months. We’ve noticed that around the time the issue seems to be really noticeable, their rankings in Yahoo will slide, with pretty much every keyword down between 1 and 10 spots (most are 1 to 5), fairly universally and not specific to a given interior page, keyword group, etc. Have you seen this sort of thing before? We’re also seeing a lot of one-position drops in Google, but not nearly as universal as with Yahoo. With Yahoo it’s like 50 out of 60 keywords we’re tracking.
Good to see you. I’m sorry to hear about the problems that you’ve been having, but I appreciate your sharing your experience.
I haven’t had to face too much in the way of ongoing server problems influencing search results over a long period of time. Moving hosting can be a little challenging, but it can be a very good step to take quickly if problems don’t seem like they will be resolved in a short period of time. I also try to warn clients that effective SEO may mean moving from the hosting they have if it can’t handle increased levels of traffic.
I have seen positive effects from addressing page load and rendering issues, and try to include recommendations to improve a site from those vantages. Simple things like compressing images and including heights and widths for those images, enabling something like gzip, using a subdomain or CDN for images, moving page layout from tables to CSS, and others can result in more pageviews, more conversions, more visits, and even more links pointing into a site.
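To give a sense of why something like gzip is on that list, here’s a quick Python sketch that compresses a repetitive HTML-like payload; the markup string is just a stand-in for a real page, but HTML is typically repetitive enough that real pages compress in a similar spirit:

```python
import gzip

# A repetitive HTML-ish payload, standing in for a typical page;
# the markup itself is made up for this example.
html = ("<div class='item'><span>Example row</span></div>\n" * 200).encode("utf-8")
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.1%} of original)")
```

The same idea applies on the server: with transfer compression enabled, the browser downloads the smaller compressed body and decompresses it locally, which usually beats sending the raw markup over the wire.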
It’s hard to draw a direct correlation between optimizing page load and higher rankings, but it helps to hear of experiences like yours where downtime and latency issues may be a cause for reduced rankings.
If you haven’t had a chance to see it, there’s a video linked to in the comment above from Aptimize that describes Shopzilla’s redesign and the impact it had on their search traffic. It’s definitely worth watching.
I think a lot of factors affect the ranking of a website, such as those you have mentioned here: the content of the site, links, the text in those links, etc. But I think page loading should also be good; some websites take ages to load, which should not be the case. Even YouTube doesn’t take that much time to load, and it contains videos, so page loading should be considered in ranking a website along with the rest of the factors.
Right, that was the point of my post above – a search engine could consider page load time, and the amount of time it takes for a page to render, as a ranking signal. The patent I wrote about discusses what Yahoo might look at, but it probably isn’t much different from what Google or Bing would look at as well.
Google announced on April 9th, in their post on the Google Webmaster Central blog, Using site speed in web search ranking, that they would start considering how quickly a page loads as one of their ranking signals.
Great info here, and everything I’m reading seems to suggest it’s going to be an even bigger factor in 2011. Some other tips in this area are in the article below.
Great article, I 100% agree here. I had one client who had a heavy music player, and after much encouraging they agreed to remove it. Within weeks of removing it (much less load time), I saw a nice climb in the rankings for almost all of the site’s keywords.
I’ve seen improvements in rankings for keywords on sites where the speed of the pages was increased tremendously also, though it’s hard to tell if increasing the speed of the pages directly impacted the search results, or if that was a result of more people visiting more pages (because they were much faster) and people weren’t abandoning the pages when they were slow to open when clicked upon at the search engine search results pages.
Now that we know Google has said this is a factor, it’s definitely something all SEOs should consider, but in reality I can’t see it making a huge difference unless you had some serious latency issues. If your page loads in a reasonable time, then I would say you’re OK as far as SEO goes; as for users, the quicker the better.
Google also said that page speed likely would only impact a small number of sites. But, page speed is one of those areas that site owners have the most control over when it comes to their sites, and slow pages on a site can cause visitors to leave.
Also, there still are a lot of people who access the web through dialup accounts, and a site that might seem to load quickly on broadband may be much slower to people connecting via their phone lines. If a site loads slowly, it could possibly impact rankings. But regardless of search engine rankings, it could be hurting how many pages visitors view, and how many conversions you receive.
It would make sense if site speed only becomes a ranking factor above a certain load time. Nobody cares if a site loads in 0.4 or 0.3 seconds. If you watch some interviews with Matt Cutts, that also seems to be the case. He also seems to say that site speed only becomes a factor if two sites’ other ranking factors would make them rank somewhat similarly. Then site speed can be the determining factor.
But in the end, does it really matter if it is a ranking factor? Everyone with a self-respecting site cares about its users, and a slow site can really hurt the user experience.
Regardless of the implications for search rankings, there’s value in providing pages that load quickly, and don’t require visitors to wait. That should be enough by itself to motivate developers to make pages quicker. 🙂
I don’t think the question is whether load time influences SEO, but rather what percentage it carries compared to other factors such as keyword density or links. I’d probably make content 40%, links 30%, HTML 15%, page load time 7%, and others 8%. What do you think?
It’s really hard to offer any kind of breakdown on rankings and the percentages that might impact those rankings because different sites, and different types of sites may be treated differently by Google. For instance, news pages may be ranked in part based upon how fresh and novel the stories they contain are.
Sites that provide great content for little-searched-for queries might not need as many links pointing to them to rank well for those queries as pages that might rank well for more competitive terms. We were told by Google that page load time might not be much of a factor for most sites that load fairly quickly, but could possibly negatively impact pages that are very slow loading.
I’d definitely recommend for most sites that they begin by covering all the basics well, such as making sure that all pages are linked to by at least one text link, each page has a unique page title that describes it well, and a meta description that describes the page well while also being engaging and persuasive, etc. Building a good foundation for SEO is a first step, followed up by developing content that people would be interested in linking to and referring others to, and finding ways to promote that content.
Well it’s fair enough to say that the percentage might vary for every single case but coming from a Maths background I like putting everything into numbers.
Do you mean that the time of loading is a hurdle? This is actually one of the factors that the webmaster can control the most, and I believe that Google has realised that it is one of the quality factors and is therefore giving it more and more importance now.
I understand the desire to put those types of things into numbers, and it’s possible that it might make it a little easier to do so, but the reality is that the numbers that you would have to use are complex enough that simplifying things to that level might not be helpful.
For example, if we just look at anchor text alone, the value that it passes along can depend upon:
1) The relationships between more than one site that use the same anchor text but might be related in some manner, so that the weight of relevance each site passes along might be reduced.
2) The relationships between more than one site that use the same anchor text but aren’t related in some manner, so that the weight of relevance each passes along is a larger amount.
3) If the anchor text used is “related” to the content found on the page being linked to, so that it is given a full amount of weight.
4) If the anchor text used is neutral or completely unrelated to the content found on the target page, in which the weight of relevance passed along is reduced or possibly ignored, at least for a certain threshold of links. For example, the Adobe Flash download site ranks very well for the query “click here,” even though those terms aren’t found on the page and it’s questionable how much the content of the page is “related” to that query. But there are so many links that the anchor text is being counted.
There are other potential factors that could cause that anchor text to weigh more or less as well, and some of them may rely upon certain thresholds or confidence levels being met.
When we look at search results, we sometimes see very relevant pages with lower PageRank outranking less relevant pages with higher PageRank. Those relevance and popularity measures are combined in some manner to provide the final rankings, and the relevance and popularity for each of those results may differ dramatically, but combined (possibly along with many other signals) we get a final ranking.
As for speed of loading or rendering, it is one of the factors that webmasters can control, but it’s likely one that only impacts a very small percentage of sites when it comes to rankings. I would suspect that Google would much rather show a result for a site that is a little slower but much more relevant than one that is much less relevant but loads quicker.
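One toy way to picture that trade-off is below. Every weight, threshold, and penalty here is invented purely for illustration (this is not Google’s formula, which isn’t public); the point is only that when latency enters as a small penalty above a threshold, a much more relevant but slower page can still outrank a faster, less relevant one:

```python
def combined_score(relevance: float, popularity: float, load_time_s: float,
                   slow_threshold_s: float = 8.0,
                   max_penalty: float = 0.05) -> float:
    """Toy scoring sketch: relevance and popularity dominate, and
    latency only subtracts a small, capped penalty past a threshold.
    All weights and thresholds are made up for illustration."""
    score = 0.7 * relevance + 0.3 * popularity
    if load_time_s > slow_threshold_s:
        score -= min(max_penalty, 0.01 * (load_time_s - slow_threshold_s))
    return score

# A highly relevant but slow page can still outrank a fast, less relevant one.
slow_relevant = combined_score(relevance=0.9, popularity=0.5, load_time_s=12.0)
fast_less_relevant = combined_score(relevance=0.6, popularity=0.5, load_time_s=1.0)
```

In this sketch the slow-but-relevant page still wins, because the latency penalty is capped well below the gap in relevance, which matches the intuition that speed would act as a tiebreaker rather than a dominant signal.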
I never thought that search engines would actually be affected by load times. Obviously, if a page went up and down and happened to be down when a web crawler from a search engine was visiting, then that would affect it. But if your site is slow enough to affect SEO, I think you have more problems to worry about. Nice article, something new 🙂
If a site is slow enough that SEO might be affected, then it probably does have some other significant issues as well. It won’t have as many page views as visitors lose patience, it might fail to convert sales of goods, of services, of signups to newsletters, it may not resolve quickly enough when someone clicks on it in search results or on a link from another page, and the potential visitor may decide to click upon another link. It might not attract as many links as it could and potentially should, from other sites. I could probably go on with other negative impacts as well. 🙁