Will Content Change on a Page Lead to Changed Search Engine Rankings for that Page?
Tomorrow the footers of a great number of websites will automatically change to show a new copyright date. Others will wait for site owners to code the change manually. It’s a change worth making because it shows visitors that the sites are maintained and up-to-date. However, as changes go, it’s a fairly insignificant one, and likely won’t influence the search engine rankings of pages. Many pages on the Web change in minor ways every day, including updates to visitor counters, subtle changes in formatting, and new advertisements shown on pages.
Many other web pages change in more significant ways regularly, from blog home pages that show new posts, to news media sites that might add new storylines every 15 minutes, to social sites that constantly change as multitudes add updates.
How frequently a search engine crawler might visit a particular page on the Web can depend in part upon how often content change happens on a page. For example, a news site updating every hour might have Googlebot or MSNbot sniffing around hourly to devour new content.
It might be an easy assumption to make that when a search engine crawls and indexes that new content, it’s only looking at the content that exists at the time of a visit, and not accounting for how much change has actually happened on a page since the last visit. But what if search engines pay attention to the frequency of changes to pages, and record the amount and type of content that changes? How might that matter to search engine rankings?
What if a search engine tracked these changes, and the changes themselves influenced rankings?
A recently published patent application from Microsoft refers to the tracking of changes in documents over periods of time as “Temporal Dynamics,” and looks specifically at content change to things such as:
- Terms included in or associated with a document,
- Anchor text in a document,
- Colors and sizes of images,
- Tags assigned to documents,
- The positions of text or images,
- Queries used to retrieve the page,
- Amount (volume) that a document changes over time,
- Frequency/rate that a document changes over time,
- Nature of changes made to the document,
- Other changes that may occur over time
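As a rough sketch of what tracking a couple of those signals might involve (this is my own illustration, not the patent’s method; every name and threshold here is hypothetical), a crawler could diff successive snapshots of a page and keep a running history of how much its vocabulary changed:

```python
from collections import Counter

def term_change(old_text, new_text):
    """Measure how much a document's vocabulary changed between two crawls.

    Returns the fraction of term occurrences added or removed, relative to
    the larger snapshot (0.0 = identical, 1.0 = fully rewritten).
    """
    old_terms = Counter(old_text.lower().split())
    new_terms = Counter(new_text.lower().split())
    added = sum((new_terms - old_terms).values())
    removed = sum((old_terms - new_terms).values())
    total = max(sum(old_terms.values()), sum(new_terms.values()), 1)
    return min((added + removed) / total, 1.0)

class ChangeHistory:
    """Accumulates per-crawl change amounts so that the volume and rate of
    change could later feed a ranking signal."""
    def __init__(self):
        self.snapshots = []   # (crawl_time, text) pairs
        self.changes = []     # change fraction for each crawl interval

    def record(self, crawl_time, text):
        if self.snapshots:
            _, prev_text = self.snapshots[-1]
            self.changes.append(term_change(prev_text, text))
        self.snapshots.append((crawl_time, text))

    def change_rate(self):
        """Mean change per crawl interval: a crude 'rate of change'."""
        return sum(self.changes) / len(self.changes) if self.changes else 0.0
```

A real system would of course look at far more than a bag of words (anchor text, image positions, tags, and so on, per the list above), but the basic bookkeeping would be similar.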
One place where this information might be used may depend upon whether or not a query used in a search is deemed informational or navigational.
An informational query is one where there may be an intent by a searcher to find information on a topic, such as “How do I add drop shadows to words using CSS?”
A navigational query is when someone is searching for a particular page, such as typing “Hilton” into a search box to find the Hilton Hotels homepage.
When someone is looking for information about recent events or something fairly new, it might benefit a searcher for a search engine to show a page where new terms have been recently entered into the vocabulary of the document. On my example informational query above (“How do I add drop shadows to words using CSS?”), that might mean that a page that recently added the phrase “CSS 3.0” might be boosted in search engine rankings.
For navigational queries, we’re told that a page with content that hasn’t changed substantially over a period of time might be boosted in search results. However, I’m not sure how well that works with news and media sites where the content changes regularly, such as a navigational query for ESPN or NYTimes, and the patent filing doesn’t seem to address that issue.
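That asymmetry between query types could be sketched as a simple scoring adjustment. To be clear, the function below is purely illustrative; the boost factor, the stability threshold, and the idea of a precomputed change rate are my assumptions, not anything the patent filing specifies:

```python
def adjust_relevance(base_score, change_rate, query_is_navigational,
                     stable_threshold=0.05, boost=1.2):
    """Tilt a base relevance score using a page's rate of change.

    Hypothetical rule of thumb from the discussion above:
      - navigational queries favor pages whose content has been stable,
      - informational queries favor pages with recent vocabulary change.
    """
    if query_is_navigational:
        # A page that barely changes is a good navigational target.
        return base_score * (boost if change_rate < stable_threshold else 1.0)
    # Informational query: reward pages that have picked up new terms.
    return base_score * (boost if change_rate >= stable_threshold else 1.0)
```

A frequently changing homepage like ESPN’s is exactly the case where a naive rule like this would misfire, which is the gap in the patent filing noted above.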
The content change patent is:
Assigning Relevance Weights Based on Temporal Dynamics
Invented by Susan T. Dumais, Jonathan Louis Elsas, and Daniel John Liebling
Assigned to Microsoft
US Patent Application 20100325131
Published December 23, 2010
Filed: June 22, 2009
A system described herein includes a receiver component that receives a first dataset, wherein the first dataset comprises temporal dynamics about a document that is accessible by a search engine, wherein the temporal dynamics comprise an identity of a term corresponding to the document and an indication that the term has been subject to change over time. The system also includes a weight assignor component that assigns a relevance weight to the document based at least in part upon the temporal dynamics about the document, wherein the search engine utilizes the relevance weight to assign a ranking to the document with respect to at least one other document when the search engine retrieves the document.
The patent provides a much more detailed description of a process that could be used to track content change on a web page and use it to influence search engine rankings.
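Reading the claim language loosely, it describes two cooperating pieces: something that receives the temporal dynamics data, and something that turns it into a relevance weight. A minimal sketch of that shape might look like the following; the class names, the data layout, and the toy bonus rule are invented for illustration and are not from the patent:

```python
class ReceiverComponent:
    """Receives a dataset of temporal dynamics: for each document,
    which terms have been subject to change over time."""
    def receive(self, dataset):
        # dataset: {doc_id: {"changed_terms": set_of_terms}}
        return dataset

class WeightAssignor:
    """Assigns a relevance weight to a document based on its temporal
    dynamics, for the search engine to use at ranking time."""
    def assign(self, doc_id, dynamics, query_terms, bonus=0.1):
        changed = dynamics.get(doc_id, {}).get("changed_terms", set())
        # Toy rule: a small bonus per query term that recently changed
        # on the page (e.g. a newly added "CSS 3.0").
        overlap = changed & set(query_terms)
        return 1.0 + bonus * len(overlap)
```

In this toy version, a page that recently added a query term like “CSS 3.0” would receive a small boost at ranking time, while pages with no recent overlap would keep their baseline weight.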
What I found interesting wasn’t so much Microsoft’s process itself, but rather that they might capture information about changes to a page, and possibly use that information to influence search results.
Add a new keyword phrase to a page’s title and heading and a sentence or two of the content, and a search engine tracks those changes. Add a couple of new pictures to an old blog post, and the search engine makes a note of it. Completely rewrite the content of a page a couple of times in a short period, and the search engine might decide that the page is more appropriate for informational queries than navigational ones.
While this is Microsoft’s patent, it might make sense for Google to carefully consider how pages change over time as well, and to let those changes influence search engine rankings. It seems the search engines may not just look at a webpage as it exists today but may have a memory that considers what a page was like in the past, and how it may have changed.
I could see change information about pages being used in many ways by a search engine that goes beyond deciding whether or not a page is more informational or navigational in nature. For example, changes to a page might signal a change in ownership of a site, a possible intent to spam, an attempt to provide new information, make a page fresher, and other things as well.
When you make significant changes to a web page, what signal might you be sending to the search engines?
Oh, and Happy New Year – may all your changes over the coming year be good ones.
Added (12/31/2010 at 12:09 pm): A couple of days ago, I received an email offering me $125 to add some text and a link to a commercial page on a six-month-old blog post on another blog. How much would a change like that stand out, if a search engine not only indexed a page according to its present state but also looked carefully at a history of changes to a page?
68 thoughts on “The Impact of Content Change on Search Engine Rankings”
The £125 for a deep link is something tried and tested.
From personal dealings with three major UK newspapers I can tell you they sell links from deep, old pages on a short lease (2-4 week) tenancy basis, and remove the link once the page has been recrawled (thus providing the crawler with the new signal).
Don’t want to name them right here, but let’s say we are talking both tabloid and broadsheet.
These links were bought via an agency in England.
It is interesting to think of navigational and informational sites based on aged vs. recent content.
Not sure if it makes such a difference; it’s hard to tell.
Anyone have any examples of aged, unchanged content versus dynamic content?
Bill – I thought the most interesting thing you said was that “It seems the search engines may not just look at a webpage as it exists today, but may have a memory that considers what a page was like in the past, and how it may have changed”.
If your assumptions are correct then this would bring on a whole new dynamic in search engine optimization. Reading this post has made me more conscious of how I change the content on my web pages because I’m not sure if the signal I’m sending to the search engines is a good one.
But then again, who doesn’t make changes to the content of their pages every now and then?
Recently, I have been going back to rewrite or update informational content on a few pages and posts. One of my posts had ranked very well in the search results for a particular keyphrase, but I knew that the post was not the best source of information, so I was not surprised to see it slip down in the SERPs. Earlier this month, I came across some new information, so I updated the post. The post was bumped up in the results after the fact, and I could only see that change as the reason. I guess this would be a method to judge the integrity of a site. Would a spammer update posts or pages from two years ago?
Funny thing is, I visited the site of the company that sent me the email, and they were presenting what they were doing as a revolutionary new approach to helping sites rank better in search results.
I was surprised to see the patent veer off into a discussion of how things like the frequency of change on a website might be useful in determining whether or not a page might be more appropriate for a navigational query or an informational query. It left me wondering how much testing and research went into those assumptions – I’d like to see it.
Change is inevitable on most web pages, but the patent made me question what a trail of changes that I might make on pages I’ve worked on might signal to search engines as well. It is potentially a new dynamic that I’ll be keeping in mind going forward.
I see some people offering to tweak pages on a continual basis for a monthly fee. I think changes contribute to site freshness and help boost rank a little bit. I think it is a good idea to freshen up old pages, but only if you can add some solid enhancements (aside from copyright, that is) to the content.
Happy New Year!
PS – This kind of information is really good because it confirms value and sometimes it is hard to convey value of certain activities to customers. Thanks.
This is a very interesting detail you have unearthed.
What you mentioned here really intrigued me – “Changes to a page might signal a change in ownership of a site, a possible intent to spam, an attempt to provide new information and make a page fresher, and other things as well.”
I have a nasty habit of making changes to past posts and pages. Sometimes I add more paragraphs, rewrite huge chunks of text, and even add/del links in the posts. From reading your article here, my challenge becomes how to signal to the engines that it is a good change instead of a spammy change.
What’s your take on that Bill?
I agree that the quantity of change does signify the dynamic nature of the page in time. This can be achieved by several means, including RSS feeds on the page, especially if those feeds change constantly. So static information pages on a website can be made to appear dynamic by allowing some of the information to be constantly streamed. I do this on the home page of my site, which contains my blog above.
Paid insertions would be dead if future algorithms consider this system. Small changes that are done on a page can definitely affect SERP rankings, and tracking those changes might be a bigger help for SEO than for CRO; if this were to happen, I would really enjoy doing a lot of A/B testing just to try it out.
I can also vouch for JC’s post as I worked in a link ‘clearing house’ and placing hundreds of links for blue chip SEO campaigns. These links are always placed in deep old content pages within large consumer media websites: glossy magazines, tabloids and broadsheets.
Absolutely stunning results in traffic improvement. The publisher sites allowing this see it as a commercial deal, but none demarcate the links within an ‘Advertisement’ container.
Big agencies are doing this — not just in the UK — the agencies I have done this for are global specialist SEO agencies, the ones who publicly frown on link buying.
I believe the crack down on paid links will become more aggressive in 2011 with the search engines setting up honey traps and actively mining SEO communities (white and black hat) to build relationship maps.
I found your advice very useful. I’m used to adding fresh content to my websites almost every day, and from their stats I can see that Googlebot seems to like it, as it comes by many times daily.
I’ve always considered gradual changes to have some impact on rankings. After all, a page that is about cakes today and herbal supplements tomorrow surely will raise a few flags. 😉
That said, I think that this kind of idea would be great for blogs or other informative pages that may become popular over time, spark discussion and then no longer be relevant and subsequently no longer needed.
Interesting find and nice write up once again.
Very interesting article. Last year at the Chicago SES show, I asked one of the speakers if we could just have 1-3 different versions of the same content/page and switch between them every few days to make it look like we had new content, and the speaker said yes. But after that I was thinking: how can Google not know the difference?
This adds an interesting take on page evolution SEO and content sophistication. I wonder if they will remove brownie points if new content is worse than old content? This brings up temporal contextual issues on top of proximal ones and adds a new dimension to relevance. Will this help Google’s aspirations towards relevance? I have noticed better results on the first two SERPs in the past week or so.
Hmmm…I have always believed that the search engines incorporate change frequency as part of the relevance calculation. Making content changes to a page signals that some thought-processing-biped went to an effort and that must mean something. How much it means, we’ll never know as they don’t tell us that or how many data points are now feeding into relevance. I am fascinated that placing random links on deep old pages has any impact on search results visibility considering the pervasive use of Hilltop and HITS by the major search engines. Then again, we really don’t know that they do because the search engines aren’t talking except for not-so-veiled threats from Matt Cutts about buying links. Hmmm…
I have read in several SEO blogs that, all things equal, fresher content is ranked higher in search engines. Hence, it is always better to update your pages as far as possible.
From my own experience, pages without any changes have no problems ranking high and staying there!
Same with pages which update content often.
However, I think this factor has to be accounted within others. To isolate it might send us on the wrong track. It’s one element amongst others, but probably not the most important.
Title + Backlinks with right anchor text remain the must for me.
Best wishes for 2011.
I’ve gone back and updated some posts with new information, and saw some similar results as well. It’s possible that part of the resurgence of those posts in rankings was that they were more relevant for the queries, or that the freshness of the updates helped also, or perhaps that both factored in.
The Microsoft patent suggests that adding new vocabulary to old posts, showing that your content is more up to date, can have an influence on rankings, and I think I agree. It’s not just that the page has new content, but also that it may use terms or phrases that are now more “popular,” like in my CSS 3.0 example.
A spammer might purchase a site that’s been around for a while and update it by adding links from its pages to other pages that they might want to rank well. Or they may pay someone to insert a new link into an older page that may have a fairly high pagerank, or tend to rank well for a particular term or phrase. Those activities aren’t quite the same as what you’ve described, and it’s possible that those types of changes (adding new “commercial links” to old pages) may send a signal to a search engine about an attempt to manipulate rankings.
You’re welcome. I’m not sure how much value “freshness” by itself has to the search engines. As you note, I think it likely helps when something substantial may be added.
Thanks for the New Year’s wishes. I hope you had a happy new year’s as well.
A good question, and if you read through the patent filing, they really didn’t go into much detail on how they might view different kinds of changes except to exclude simple changes like updated visitor counters, new ads, and formatting changes.
I would guess that most of the changes that you might make that improve the quality of pages would more likely be deemed positive, while doing things like adding scraped content to pages, inserting commercial links, and similar changes may be seen as less positive. Most of the focus in the patent though, wasn’t so much about whether or not changes would be used to identify spam, as much as noting that a frequency of changes to a page might indicate whether it was more informational or navigational in nature.
So, if you’re adding new content to blog posts, for instance, and there’s new information so you add a few “update” paragraphs, that may be a positive thing.
I have run into a good number of sites that have added an RSS stream to their pages to give it an element of change and freshness, and I think that can be an effective approach. It’s something that is often worth experimenting with, to see what kind of impact it might have.
Testing is a great thing – not only for search rankings, but for seeing how visitors react to the content and layout of your pages. The patent itself is from Microsoft, but it’s quite possible that Google could use a system like this as well.
Regularly adding fresh content to your pages is definitely a part of getting Google to come back to your pages to crawl and spider them on a regular basis.
Thank you, Robert.
I’m a big fan of regular gradual changes if possible as well. And I do like to look at old pages here, from time to time, and make some updates if appropriate – it keeps the content fresh and useful to people who visit. And if that can help with the rankings of those pages, all the better.
Thank you for sharing your experiences with us. I think search engines are going to get more aggressive in 2011 with paid links as well.
I’m not sure that I would recommend rotating the content of a page in a manner like that on a regular basis. There are other ways of adding fresh content to pages that might have better results. For instance, if you have an ecommerce site, one possibility is to rotate a few new “featured” products on your home page every week or so, so that search engine spiders are more likely to visit those deep pages in your site as well – fresh content + deeper indexing.
Interesting questions. If the new content is a little lower quality, but it includes terms and phrases that are presently trending, it’s possible that you may still see a positive boost. The patent raises a lot more questions than answers, which isn’t necessarily a bad thing – it gives us something to question and possibly experiment with.
Change frequency is definitely something that the search engines have been watching in terms of how often they should come back to visit and crawl pages of a site, but it’s less clear how closely they are watching those changes and their frequency to determine whether or not to boost the rankings of pages. One of the reasons that I spend so much time with patents is that they sometimes raise questions like this when representatives from the search engines are quiet on the topics.
Freshness can often be a positive thing, especially in news articles and blog posts, but it may not always be a positive ranking signal. I’ve seen mentions in a number of patents that suggest that for some queries, older content may be preferred. For instance, if someone is searching for “the declaration of independence,” older pages may rank better than newer ones. I’ve even seen the suggestion that Google might look at the average ages of the highest ranking pages for some terms, and if those all appear older, then it may consider “older” pages to be preferable for those terms than fresher ones.
Good points. It’s possible that factors like frequency and amount of change have the most impact when those things are extreme. When changes are minimal, or somewhat insignificant, they probably won’t have much impact at all.
In my opinion, content changes can both improve and deteriorate a page’s rankings in search results!
In fact, making “major” changes (titles, links) on a page that’s periodically updated allows it to improve its rankings for more keywords.
However, doing the same on a page that’s infrequently updated could be considered by search engines as spam.
Best wishes for 2011
The advice I always give is to be natural: if you need to make changes, make changes; if not, then don’t.
People tend to overthink the freshness issue, new content can harm as much as help, it depends on your niche. Like Bill’s declaration of Independence (we wuz robbed!) example, old content can sometimes be viewed as more credible.
In my opinion, the fresher the content the better, really. We have a subscription to a directory/online magazine. I can post on their website about one of our products and we can appear on the first page of Google a couple of hours later. You then gradually slip down over a few months, but a quick update with some new content fires you back to the first page. It’s very effective!
The freshness of content is a point everyone will agree on. Another is killer, unique content. My site, for instance, is in Brazilian Portuguese, and I’m preparing some translations of unique texts (I asked permission from the authors first) that are not available in my language to publish in my blog section.
This part of the post was kinda disturbing, especially if you are running a site like, say, a store where the content wouldn’t change much:
“But what if search engines pay attention to the frequency of changes to pages, and record the amount and type of content that changes?
What if a search engine tracked these changes, and the changes themselves influenced rankings?”
Wouldn’t it put sites like blogs and news sites at an advantage over your current site, even if you were dominating the results? It seems that maybe an identifier of some sort should be used (news site, blog site, store site, etc.) to level the playing field a bit. Not all sites are created equal: someone who has the luxury to pump out as much fresh content as they like is in a very different position from someone who has their hands tied and can’t really do anything with some of their pages. The fresh-content guy would seem to be holding all the cards.
I am just generalizing here, but it can be kinda scary depending what kind of site you are running.
Interesting post on SEO Roundtable about Google outreach to webmasters playing fast and loose with link-building and cloaking [http://www.seroundtable.com/google-unnatural-links-warnings-12761.html]. The bottom line for me is that search engines are getting smarter and increasingly dictatorial about what influences placement. Tactics like random link building and optimizing for keywords are becoming less and less effective. For me, effective search engine optimization starts with client education about how search works; research on user behavior specific to client products/services; development of appropriate core metadata for client products/services that becomes a foundation for content strategy, link strategy, and even social media strategy; and benchmarking that provides quantitative and qualitative metrics on success and guidance for future actions to build on it. Bill’s work with patents has inspired me to do the same, and this has proven invaluable in the effectiveness of my efforts.
@Beezid: I wouldn’t worry too much about that. Volume or freshness of content is no doubt as subjective as any other ranking factor. A 3 page website can potentially always outrank a 1000 page website that is updated regularly, if (and that’s a big IF) it’s a better match for a particular search.
Where I feel this will make the most difference is if someone does a search for something that’s a trending topic. Let’s face it, a website about Britney Spears that hasn’t been updated since 2002 isn’t likely to be relevant to any scandal news that’s related to current searches. But a single page that was set up in 2001 might be a better match as a Mickey Mouse Club biography.
Freshness or quantity of content isn’t always an indicator that a website is quality or worthy of ranking. We know that the search engines get it wrong (and more often than they would like), but they are continually working on it.
It’s interesting to see how people from the search engines view content changes.
The last time I wrote much about content change and its possible impact on search results was in 2008, in the post Updating Google’s Historical Data Patent, Part 2 – Changing Content, about Google’s approach, which was much more deeply defined. According to them, changes over time can have positive or negative impacts, too.
Thanks for the positive wishes for 2011 – hope yours is a good one too.
I updated a post from 2005 a few weeks ago that had been ranking well for its main term for almost 4 years, and then had slipped a few pages recently. Within a few days of my update, the page moved back to within the top 3 results for that term. The changes primarily were adding new images to the page, rather than adding new text.
Good advice – be natural – make changes because it makes sense to do so, and not just because you think fresh content will help a page rank better.
This is from the post I mention a couple of comments above:
Freshness could help, or it could hurt, and it depends upon the specific query in question as to whether it might or might not.
I’m not sure that there’s as much agreement over freshness as you might think. I see a lot of sites mention that search engines like fresh pages, when in truth sometimes a page with more mature content is preferred.
Unique content is great, but if it’s on a topic that few people search for, then it might not be as effective as you might hope.
Hi Beezid and Robert,
I think Robert’s response is 100% on target. If someone’s searching for a pentium II computer, then a page with a lot of changes, and information about Windows 7 and multicore processing systems probably isn’t going to be boosted in search results on a search for [pentium II computer].
Thank you. There are so many options available to someone performing SEO to follow, beyond spamdexing behavior like manipulative linking and cloaking and keyword stuffing, that those things aren’t necessary and are both very risky and potentially very harmful. Google’s warnings may in some instances alert site owners who might not be aware that those tactics are being used on their sites, and they might be warning shots to those who are aware. Hopefully the messages have a positive impact, and help to reduce that kind of behavior.
How about viewing historical SERP position changes via the ole Wayback Machine, within the parameters of an extremely limited, non-client test for fun? Seems big G’s been hostile over blocking one’s site from being archived for quite some time.
An SEO strategy that has worked for me consistently in the past is reworking old content. If you have more static pages, update them once every 6 months. Even adding a new “this page is outdated” image with alt text will work wonders, then redirect to a more current page with related information. I’m very certain that search engines have stored memories of cached pages somewhere and look for changes. Traffic to those reworked pages always doubles at least.
Awesome article, thanks for the info on informational queries and navigational queries! I never really thought about the different ways Google may give results based on informational or navigational intent.
The Wayback Machine just doesn’t provide a way of looking at old search results from the search engines. Since those are partially generated on the fly, it would really be difficult to archive them as they are.
Hi New Jersey,
Thanks for sharing your experiences with us.
As I mentioned in one of the comments above, I had a post here that had been ranking pretty well in Google’s search results for a number of years, which had dropped considerably in rankings in the past 6 months or so. Anticipating a followup post sometime in the near future, I made some formatting changes, in addition to adding a number of pictures. Within a few days of that change, the page started ranking pretty well again.
Thank you. The search engines are looking more and more at the intent behind searches and how relevant a page might be for that intent. And they seem to think they have a handle on identifying the intent behind at least some queries, and whether there might be an informational intent, a transactional one, or a navigational one. They’ve also been attempting to determine if there is a local intent as well – so, for instance, if you search for “pizza,” Google may think that you’re trying to find a local pizzeria, and may show you local search results, too.
I have had success in reworking old content on my sites, rewriting some stuff and adding in pictures, etc. It really seems to impact the freshness of the page, which I think is probably a ranking factor.
Great research, Bill. I’d be careful with changing older posts if these are ranking independently for a particular long-tail search phrase. It may hinder rather than help in this particular case.
It sounds like you’ve shared an experience that others commenting here have as well. I’m not sure that “freshness” is always the reason for higher rankings for all types of queries. What this patent seems to be telling us is that for some queries, a lack of frequent changes might make a page rank higher. For other types of queries, more frequent changes may also make a page rank higher. The difference might be things like whether a query is informational or navigational in nature.
Good point. I think it’s a good idea to have a sense of what words and phrases an older page might be ranking for before you make changes to that page. It doesn’t hurt to make as informed a decision as possible.
“I’d be careful with changing older posts if these are ranking independently for a particular long tail search phrase. It may hinder rather than help in this particular case.”
I find myself in just such a situation. I have a client who ranks #1 in our country for a long-tail search term solely due to the (intelligible) usage of the three words in the body text in the precise order of search, and I was wondering if I should simply keep the content up there indefinitely. The kudos of a #1 for a term (though with near zero volume) is a great feather in the cap of the company I’m working on behalf of, as it is “offline” topical, if you get my drift…
As a guideline for what to tell clients who ask, I like @Mark’s suggestion to “be natural, if you need to make changes, make changes, if not, then don’t.” I also ask clients what value it brings to a visitor of the page. If they can’t tell me, then I don’t recommend the change. After all, aren’t all these patents aimed at identifying the best content?
I believe the major search engines have separate algorithms whose job is to best interpret the query intent, including whether they want current or old information. We just (!Ha) need to guide our clients in providing the best possible answers. One way is the “what’s the value” question above. Another is “who’s your audience” and “what are they looking for”? Writing 101 perhaps, but I know I have trouble remembering…
Thanks for another good post. You always stimulate discussion.
I’ve been there.
If the term isn’t bringing much in the way of visitors that are likely to convert, it might be time to make some changes. If the client values the high ranking for the term, and it’s possible to make changes while keeping that page ranking for the long tail term, that would be the ideal situation. If it isn’t possible to keep that ranking, it might be time to have a conversation with your client, and let them know that you think it might be best to refocus the content of that page to bring them more qualified traffic that’s likely to meet their business objectives.
Thanks. I agree with you on this approach:
When it comes to search-related patents, search engines often have a few (sometimes competing) objectives. They want to make a search engine more efficient and/or faster (like Google Instant or Google Caffeine), they want to return the most important/relevant results (like PageRank), and they want to bring back the most results for a search that they can (For example, showing results for synonyms where appropriate).
I also agree with you that there are different elements within the search engine’s algorithms that try to determine the intent behind a search. When someone types in [Pizza], the search engine might interpret that as a search for a local pizza place. When they type in [ESPN], the intent is navigational – the searcher wants the ESPN homepage. When a search is [how to knot a tie], the search is informational, and the searcher doesn’t care which page is returned as long as it provides useful information.
Thanks Bill. I think I can, for the foreseeable future probably get away with keeping the text as is and enjoying both the superficial #1 and simultaneously chase higher volumes elsewhere. Have already started attacking the short-tail for this client with promising results at this stage…
That’s probably not a bad approach, especially if your client really likes ranking for that term.
Thanks for your reply, I really appreciate it.
Your article Updating Google’s Historical Data Patent, Part 2 – Changing Content is one of the best articles I have read so far about content change.
The historical data patent was filled with a lot of ideas about how a search engine might look at changes on websites.
Does this mean that bloggers have a slight advantage over website owners who offer a basic service which doesn’t change, e.g. a hotel chain? I would imagine that because blogs are always changing, by nature they would seem to have an advantage.
I think some types of queries actually favor content that changes and is updated on a regular basis, while other types of queries may benefit best from content that remains relatively static over time. As to who benefits, those are the people who can figure out which queries might benefit by which approach. 🙂
I read your article, and it seems you don’t fully believe some of the points you describe; I think the same way.
But I still have a query: suppose I have a software company site and I change content quarterly. According to you, does that affect rank? My keywords are already ranking, so please suggest how I can improve on the same.
You shouldn’t just change content on the hopes that your updates will make you rank higher in the search engines. If you make updates because your software addresses an issue that people have increasingly started expressing a concern about online, that would be a good reason to make that update, and possibly increase traffic to your pages. If you make updates because the content on your pages is dated, and your software has changed, that would be another good reason to update.
But, if you were selling software, and now you’re selling home furnishings instead, on the pages where you were selling software, that type of change might be harmful to your rankings.
Comments are closed.