Predicting SEO Changes in Rankings, Algorithms, and Penalties


Last Thursday, the Wall Street Journal published a couple of articles that point to a new direction from Google: With Semantic Search, Google Eyes Competitors and Google Gives Search a Refresh. On Friday, Barry Schwartz reported at Search Engine Land, in the post Too Much SEO? Google’s Working On An “Over-Optimization” Penalty For That, that Google’s Head of Webspam, Matt Cutts, had announced that Google was working on an “over-optimization” penalty for websites that are stuffed with too many keywords and have excessive links pointed to them.

Thursday evening I visited the Philadelphia offices of Seer Interactive to give a free presentation, hosted by Wil Reynolds and the Seer Interactive team, on some of the changes in search and social activities involving SEO. Among the possible changes I pointed out were more emphasis on search as a knowledge base, with more Q&A results, and a greater emphasis on information extraction around entities, as described in the Wall Street Journal articles.

There was a nice turnout, and the hospitality of the Philadelphia search community was tremendous. Seer Interactive is hoping to help Philadelphia become better known as a center of the search community, and with Seer’s Search Church office as a meeting place for future presentations, I think that’s a very real possibility.

Here are some of the points from the presentation that you might not get from viewing the slides by themselves. In the third slide, I mentioned that I had an epiphany in second grade that stuck with me. I went to a basketball game with my father and noticed that I couldn’t see the score on the scoreboard, even though everyone else could. I didn’t know what an “epiphany” was (I was in second grade), but I realized that not everyone had the same perspective and that our vision shapes the world around us.

In that slide I linked to a blog post of mine (other slides link to more posts from SEO by the Sea and other sites, and to some patents as well) describing a patent application that I looked at and wrote about back in 2004, one that gave me a different perspective on the importance of geography in local search. I experimented with and tried several things described in that patent filing, and they made a difference. With that success, I started looking at many other patents.

In the presentation, I point to the tremendous growth of Google’s patent portfolio over the past few years: from 187 granted patents assigned to Google at the USPTO in October of 2008, to 809 in February of 2011, to 4,163 on March 14th, 2012. Many of those patents were acquired from other companies such as IBM, Xerox, and Verizon, but many were also developed in house.

While a good number of the new patents point to hardware including game controllers, desktop and network computer architecture, fiber optics networking, and so on, many also point to new approaches to search, including recently acquired patents from Xerox on scoring document quality.

New Approaches to Rankings

Representatives from Google have repeatedly told us that, over the past few years, the company has been averaging about 500 changes a year to its core search ranking algorithms.

One place where this has been increasingly visible is in Google’s ongoing redefinition of “relevance” and what it means to present relevant results to searchers. At one point, matching keywords in a query to pages on the Web that contained those keywords was what search engines specialized in. Google brought some advances to what earlier search engines such as AltaVista and Excite and Lycos were showing us by using link analysis methods like PageRank to try to show us the most important (or at least most popular) of those pages at the tops of results.

In one of the slides within my presentation, titled “Expanding Relevance,” I show some of the results on a search for the term “Wilco” which gives us some web page results, including the home page for the band, and for a business with that name, videos in case searchers wanted to listen to the band, a tour schedule with links to ticket sales, news results for the band, and links to albums by the band. Relevance has expanded beyond just finding and ranking webpages that include the query terms on those pages to providing different ways to meet the intent of searchers.

The Wall Street Journal articles hint at a time when Google might better understand different attributes associated with specific entities and show results that might be more relevant. The results for the band Wilco are a good example of what we might see with other types of entities in the future. Google recognized that Wilco is a named entity and that there are attributes associated with it that it might be good to show in search results, such as [wilco tickets], [wilco schedule], [wilco videos], and [wilco albums].

Expect that in the future, when our searches include other named entities, or specific people, places, and things, we might see a wider range of results that cover attributes associated with those entities. A search for Hawaii, for example, might include information about travel and tickets to Hawaii, weather, recent news, history, and other types of attributes that searchers might associate with searches related to Hawaii. What this might mean for someone creating pages on the Web about Hawaii is that it might become harder to rank well with general pages about Hawaii, and easier to show up in search results for the different attributes people might be interested in related to Hawaii.

This type of information extraction to understand specific entities and extract facts about them is something people at Google have been pursuing for years, and it has been part of what they’ve focused upon for at least as long as they’ve been using systems like PageRank. An early paper from Sergey Brin, from his days at Stanford, shows an interest in this kind of approach going back more than a decade: Extracting Patterns and Relations from the World Wide Web (pdf).
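
To make the approach in that paper a little more concrete, here is a rough sketch of the bootstrapping idea Brin describes (sometimes referred to as DIPRE): start from a handful of known facts, learn the text patterns those facts appear in, and then reuse the patterns to pull new facts out of a corpus. This is only an illustration of the concept; the tiny corpus, the seed fact, and the regex-based pattern format are my own assumptions, not anything taken from the paper or from Google.

```python
import re

# Toy illustration of DIPRE-style bootstrapping (the data here is made up):
# learn patterns from seed (entity, attribute) facts, then apply those
# patterns to the corpus to discover new facts.

corpus = [
    "Wilco released the album Yankee Hotel Foxtrot.",
    "Radiohead released the album OK Computer.",
    "Wilco released the album Sky Blue Sky.",
]

seed_facts = {("Radiohead", "OK Computer")}

def learn_patterns(sentences, facts):
    """Turn each sentence containing a known (entity, attribute) pair into a regex."""
    patterns = set()
    for sentence in sentences:
        for entity, attribute in facts:
            if entity in sentence and attribute in sentence:
                pattern = re.escape(sentence)
                pattern = pattern.replace(re.escape(entity), r"(?P<entity>[A-Z][\w ]+?)")
                pattern = pattern.replace(re.escape(attribute), r"(?P<attribute>[A-Z][\w ]+?)")
                patterns.add(pattern)
    return patterns

def extract_facts(sentences, patterns):
    """Apply the learned patterns to harvest new (entity, attribute) pairs."""
    found = set()
    for sentence in sentences:
        for pattern in patterns:
            match = re.fullmatch(pattern, sentence)
            if match:
                found.add((match.group("entity"), match.group("attribute")))
    return found

patterns = learn_patterns(corpus, seed_facts)
print(extract_facts(corpus, patterns))
# (order may vary) {('Radiohead', 'OK Computer'), ('Wilco', 'Yankee Hotel Foxtrot'), ('Wilco', 'Sky Blue Sky')}
```

In the paper, every newly extracted fact becomes another seed and the loop repeats across the Web rather than across three sentences; the point here is simply how few hand-labeled examples that style of extraction needs to get started.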

The presentation covers many other approaches to search and rankings, including phrase-based indexing, concept-based indexing, using triples of data from large data sets involving users, queries, and sites to predict pages people might want to see, building a knowledge base of aspects about entities, and a planet-scale distributed data system that can include both global and regional results. The growth of systems like these can mean less reliance upon exactly matching keywords on documents and upon excessive links.

The presentation also looks at some of the patents and papers that might be behind Google’s increasing use of social signals in ranking pages in both social search and eventually web search itself. Again, the use of signals like these can mean that some of the signals that Google used in the past might not carry as much value as they do now.

Over Optimization

Matt Cutts has said that Google may come up with an “over-optimization” penalty in the future, to help sites that aren’t showing up as high in search results because other sites have excessive links pointed to them or repeat specific keywords too often. Looking at many of the company’s patent filings and whitepapers, you can get the sense that this is something Google has been aiming at for years.

For instance, one patent that I wrote about in October of last year described how Google might identify when site owners take over other sites and use them to create links to their sites, using pages from the acquired sites as doorway pages. That can include links that might be part of private blog networks, or from individual pages that aren’t part of such networks.

Google’s phrase-based indexing approach also includes a method that might help identify web spam based upon a statistically unusual number of related co-occurring phrases appearing on a page.
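
As a rough illustration of how a check like that might work (the phrase list, sample counts, and threshold below are my own toy assumptions, not values from the phrase-based indexing patents), you could compare the number of related phrases on a page against what typical pages on the same topic contain, and flag the statistical outliers:

```python
from statistics import mean, stdev

# Sketch of the idea: a normal page about a topic uses some of the phrases
# related to that topic, while a stuffed page uses an improbably large number
# of them. All of the data below is illustrative.

related_phrases = {
    "asbestos exposure", "mesothelioma lawyer", "asbestos attorney",
    "mesothelioma settlement", "asbestos lawsuit", "mesothelioma claim",
}

# Counts of related phrases observed on pages already judged to be normal.
typical_counts = [1, 2, 2, 3, 1, 2, 3, 2]

def count_related(page_text, phrases):
    """How many distinct related phrases appear in the page text."""
    text = page_text.lower()
    return sum(1 for phrase in phrases if phrase in text)

def looks_stuffed(page_text, phrases, sample=typical_counts, sigmas=3.0):
    """Flag a page whose related-phrase count sits far above the normal range."""
    threshold = mean(sample) + sigmas * stdev(sample)
    return count_related(page_text, phrases) > threshold

print(looks_stuffed("Our firm handles asbestos exposure cases.", related_phrases))  # False
print(looks_stuffed(" ".join(related_phrases) + " call today", related_phrases))    # True
```

The patent filings describe this in statistical terms, with an expected range of related phrases for an ordinary document about a topic; the toy threshold above just stands in for that expectation, and a keyword-stuffed doorway page tends to land far out in the tail.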

Another Google patent that I wrote about a couple of years ago, Google’s Affiliated Page Link Patent, described how Google might limit the amount of PageRank that flowed from pages on one site to pages on another that appeared to be related in some manner, such as being under the same ownership or having some other close relationship.
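
A simplified sketch of that idea follows. The affiliation test, the 0.1 discount, and the ownership data are my own illustrative assumptions, not terms or numbers from the patent; the patent talks about relationships such as shared ownership or other close connections between sites, and about passing along less value across links between them.

```python
# Sketch: a link between pages that appear to be affiliated (here, sharing a
# registered owner) passes along a reduced amount of ranking value compared
# with a link between unrelated sites. All values below are made up.

AFFILIATED_DISCOUNT = 0.1  # assumed fraction of link value kept for affiliated links
NORMAL_WEIGHT = 1.0

def are_affiliated(source, target, ownership):
    """Toy affiliation check: the two domains share a registered owner."""
    owner = ownership.get(source)
    return owner is not None and owner == ownership.get(target)

def link_value(source, target, base_value, ownership):
    """Value a link passes along, discounted when the two sites look affiliated."""
    weight = AFFILIATED_DISCOUNT if are_affiliated(source, target, ownership) else NORMAL_WEIGHT
    return base_value * weight

ownership = {
    "example-store.com": "Acme Holdings",
    "example-reviews.com": "Acme Holdings",   # same owner as the store
    "independent-blog.com": "Jane Blogger",
}

print(link_value("example-reviews.com", "example-store.com", 5.0, ownership))   # 0.5
print(link_value("independent-blog.com", "example-store.com", 5.0, ownership))  # 5.0
```

However Google actually measures affiliation, the consequence the patent describes is the same: a network of sites under one owner can’t simply vote PageRank back and forth among its own pages at full strength.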

An aim of good SEO is to improve the quality, relevance, and usability of pages for visitors, so that the objectives of the owners of those pages are furthered, and people looking for what is offered on those pages are more likely to find them. Optimization, as a term, means to make something the best that you can, and in SEO it usually means making a page the best that it can be at satisfying the people who use a query term the page is about, whether their needs are informational, situational, or transactional.

Some people promoting web pages attempt to use tactics like overstuffing a page with a particular keyword, or pointing as many links at it as possible that use that keyword in the anchor text, without necessarily attempting to make that particular page one that will satisfy visitors’ needs. If you listen to the audio from the Search Engine Land post that I linked to in the first paragraph of this post, those types of activities, aimed at making a page appear more relevant than it is for a specific term or phrase, appear to be what Matt Cutts is discussing when he talks about an “over-optimization” penalty.

So a penalty like this might do things like ignore the value of anchor text in blog comments or forum signatures pointing to pages, lessen the value of links between sites that are related in some manner, lessen the value of keywords or related terms that appear on the same page at a very high rate, or apply some other similar approaches.

That doesn’t mean that the value of thoughtfully created, high-quality pages, following best SEO practices will be harmed. The goals of that type of SEO align with the goals of search engines in helping people find pages that help meet their needs.


108 thoughts on “Predicting SEO Changes in Rankings, Algorithms, and Penalties”

  1. I have a feeling we will all have to be:
    1) more proactive about what types of links we allow to point to our domains after all is said & done.
    2) better informed about the SERP landscape, as Google will work harder to keep traffic within their own domain.

  2. Bill,

    Thanks for taking the time out of your busy schedule to present at SEER last week. You gave great insight into “relevance”, what search engines have done in the past, and what we should look for in the future based on your expertise and the patents you have analyzed. It was great to see so many Philadelphia area SEO experts come out to hear you speak and then meet up afterwards. Anyone who hasn’t had the privilege of attending one of your presentations or talking with you one-on-one is really missing out.

    Also – as always, great post! I especially appreciate the way you ended the post with:

    “That doesn’t mean that the value of thoughtfully created, high quality pages, following best SEO practices will be harmed. The goals of that type of SEO align with the goals of search engines in helping people find pages that help meet their needs.”

    It’s top industry experts like you that make our (SEOs’) lives easier. Thanks!

  3. Hey Bill,

    Thanks for the post, great as always. One thing I find myself perpetually interested in, in terms of “where things are going,” is which current strategies are sustainable and which are not. The idea of entities becoming stronger and more entrenched is a given – however, do you see anything else just getting “wiped out” in the future? For example, this happened with national sites attempting to rank regionally when local results started dominating SERPs. I’m not sure it’s in your core thought process (or ideology) to make these kinds of prognostications, but I would love to hear them. For example, I see things like “PRODUCTNAME review” or “best PRODUCTS” being mostly spam intended to get affiliate payouts rather than truly quality content.

    Appreciate your thoughts!
    Ross

  4. Over optimisation penalties have existed for a while now. Google goes on and on about introducing this and that, but to be honest none of it seems to materialize. Didn’t they say that they were going to build a tool to fight paid links?

  5. What do you mean on slide 30: “Promptness in relation to the timestamp of the original location”? Is that saying that a comment/reply etc. is – all else equal – more relevant when it is made closer to the publish date? i.e. to combat rogue comment spamming on old posts?

  6. The problem with “penalizing” for SEO efforts is that this can be used to downgrade big players in the game by spamming them with cheap, relevant links (this is probably already done to a certain extent), except that now, for sure, over optimization will bring them down.

    Anytime you induce a “Penalty” into a system, it can easily be turned into a weapon, especially in a system that does not reward you completely for DIRECT efforts to improve your rankings.

    If I had enough time and budget, and I was #2 in a major industry and couldn’t beat #1, why not screw up their link profile by “over optimizing” their keyword links and bringing them down while you do the right thing for your own site?

    This is where the game gets dangerous.

  7. I’m skeptical of any use of links to penalize websites, because it would be so easy to game. My inbox piles up with people offering me thousands of links for under a dollar a link. If I’m a mesothelioma lawyer, where each case can result in a seven-figure recovery, I have ample funds and incentive to spend $10k and annihilate my competitor with spammy “mesothelioma asbestos lawyer lawsuit attorneys settlements” links.

    That’s why I think Google has gravitated to the ‘dampening’ model of spam, where they don’t penalize for it — unless they know you’re selling links — but instead just try to limit the benefit. That can be gamed, too, of course, by simply ramping up your spam efforts, but the good part is that there’s no way to attack competitors with it and the benefits are (hopefully) limited.

    Then again, as someone who is routinely outranked by competitors who have never had one legitimate editorial link pointed at their sites, I’m happy to hear Google is at least making noises about fixing the problem.

  8. I must admit that I am looking forward to this ‘penalize over-optimized sites’ initiative with some sick relish. Yes, a lot of sites will probably be hurt by this, including some of my own – but deep down I’m hoping it totally upends the current paradigm for something new and exciting.

    That being said, I doubt these new algorithms will be anything revolutionary. It’ll be more along the lines of the small incremental changes Google has been making for years.

  9. Bill,

    I can see anchor text from comments and forums (especially forums) being downgraded pretty soon. It’s a shame that many abuse this approach to backlinking and the rest of us have to suffer. I’m not sure if Google is intelligent enough to notice when a quality comment is added so that the anchor text is not penalized.

    We shall have to wait and see what happens.

    Andrew

  10. Hi Bill,

    As you’re saying, Google is looking to penalize over-optimized web pages to help the sites that aren’t showing up as highly in search results just because other sites have excessive links pointed to them. It’s a rocking one.

    I’m sure lots of dark-minded webmasters and website owners have started worrying about their search future.

  11. Thanks for a very interesting post.

    I really like the way “relevance” is finally starting to align with the upcoming search results. It would be great to finally see the truly relevant web pages ranking in Google. I just hope they don’t forget the minorities this time. They have a tendency to forget smaller countries and only make changes for UK/US-based languages. Even though the large majority of the internet is in English, I still think relevancy should be the main focus worldwide.

  12. I think having a 360 degree view of one’s SEO is more important than ever before.

    1) You have to keep an eye on the links you build, as well as the ones you don’t (built by the competition to undermine you, scraper sites, etc.)
    2) You have to keep a relative gauge of your level of optimization vs your local competition, and the industry in general.
    3) You need to take a good look at your site content and see if it’s truly providing value. Good enough ain’t going to cut it.

  13. Thanks Bill for the post. Though I’ve got more questions than answers now. The biggest question might be “When is a site over optimized?” I mean, it should be optimized, but when is it too much?

    I’m quite sure that Google has already applied some of its changes. On one of my sites I noticed a decrease of 60% in 5 days. Another site of mine has an increase of over 25%. But don’t ask me where the differences in optimization, code, etc. are. I simply don’t know.

    Again, thank you so much Bill.

  14. I think it is easier to predict long-term changes than what happens in 6 months. Long term, I am fairly confident the things I notice that are bad will be reduced. So yes, when I see some lousy forum that doesn’t let me see the content I came to see, it shouldn’t show up – I expect Google will fix this (they did years ago for that site that I forget). When I notice that far too many bad results I see today have my search term in the domain name, I expect Google to dial back how heavily they weight that in the future. It takes Google far longer to adjust to these things than I would expect.

    Some of Google’s moves have to counter previous moves by Google. Google decided paid links are bad. But links that are due to $ in a different way are fine (if you completely buy a site, say, instead of paying for a link). With that move, Google encouraged certain behavior which, as it becomes too common, creates worse SE results.

    The words “excessive links pointed to them or contain specific keywords more often” sound a bit weird to me. But I realize that Google will base “excessive” on numerous measures. It isn’t the number of links, it is the excessiveness of those links given the context (cnn.com is likely to have tons of links, but maybe not excessive given the context).

  15. I saw that video earlier this week. I think Google is really trying to bust up some of the more aggressive online marketers who automate many aspects of SEO, purchase sites for the PageRank, and keyword-stuff to get Google to rank them better than the sites a user is more likely looking for. I think the on-page stuff has been going on for quite a while, with the meta tag, alt tag, and other tags receiving less weight, along with a lot of other on-page attributes. It seems this just goes one step further and aims to find link wheels and other black hat methods for gaining position.

  16. Thanks Bill, as always, for shining the light into the darkest corners of some of the most powerful, secretive companies on the planet. The force is with you.

  17. The over optimization penalties fascinate me. I could see a lot of SEOs scrambling to embrace Rand Fishkin’s idea of inbound marketing should Google ever get that one exactly right. At that point it would no longer be enough to simply add some keywords to page titles and buy a few high-PR links. That’s a potential big-time game changer IMO.

  18. Pingback: Google Zen – Finding Balance in Site Optimization | Beanstalk's SEO News Blog
  19. The real problem with going after over optimisation is that you need to define what a good level of optimisation is. If nobody was over optimised, then why would people need to do it?

  20. The game is changing! I am waiting for them to start using comparative user engagement statistics (if they haven’t already in the latest algorithm updates)… I think that would be a huge catalyst in pushing companies to make their websites more like applications and less like billboards :)…

  21. Pingback: Links from around the web: March 22nd 2011 | Flippa Blog | Buy & Sell Websites
  22. But when is a website over optimised? There’s no clear definition when a website is over optimised, so that’s the real problem indeed.

  23. I guess to remain in Google’s good graces we just need to think of users first when we optimize a site for search engines.
    Granted, it takes time to write quality, relevant page titles, but it’s always better than seeing keyword lists as links in the SERPs.
    If I were a user, I wouldn’t check those “spammy” sites; I’d rather click on the links that invite me to do so.

    That’s always a matter of balance, and in the coming months, we’ll have to pay more attention to what the right balance is between SEO and user experience.
    I do not think it is going to change the way we work today.

    BTW, thanks Bill for all these excellent articles. My first comment on your site, but I’m a fellow reader of your posts 😉

  24. The over optimization announcement was just a distraction from what they are really doing right now. Essentially they are targeting websites and threatening to penalize them if they don’t name the SEO services they used. Google’s algorithm can’t catch these networks/tactics so they have to have humans go in (once they are identified) and de-index them.

  25. Thanks Bill for the post. Though I’ve got more questions than answers now. The biggest question might be “When is a site over optimized?” Why penalize over optimization at all? The real problem with going after over optimisation is that you need to define what a good level of optimisation is.

  26. I find that SEO is unavoidably necessary but I also can appreciate those offline tactics as well, especially if they tie in with online strategy. For instance, handing out business cards with a URL to a specific landing page created for people who have my cards.

  27. Good article. From my point of view, I think we can only reach a conclusion in a few months, when we see the changes to our sites’ SERPs in Google. Only then will we have a clear view of what actually changed.
    However, I do think SEO will become a more white-hat area……

    Bottom line – relevance was always number one priority for Google

    Cheers
    A

  28. Thanks for the interesting and informative article Bill. The conflict between general entity and attribute content highlights the problem that search engines, no matter how sophisticated, cannot respond with a consistently high degree of relevance to the user intent behind general queries, because not enough information is provided by the user. For this reason, I believe that it’s a mistake for Google to delve too deeply down into the branches of attributes on general queries. If I just put ‘France’ into a search engine, chances are that results that include specific detail on tickets, wine, or Napoleon are too specific for my needs. They should be building on existing SE behavior, that is, encouraging people to articulate what specifically it is that they are looking for, rather than encouraging laziness by ruling out legitimate and useful general results on broad topics.

  29. It has already been mentioned here that there is a fine line when applying a penalty for “over optimization.” IMHO, rather than highlight this penalty, Google should just do it, thereby not giving the green light to those who wish to unfairly try and take their main competitors down.

  30. Well, I think we’ve seen some good examples recently of how over optimisation is hurting sites. If you have links from a blog network, you are over optimising; if you have links from any site that has more outbound links than inbound links, it seems like you’re over optimising as well. It is no accident that blog networks are getting de-indexed. Penalties for unnatural linking are happening more and more.
    Is this a game changer? Only for those who are gaming the system.
    Nik

  31. I agree with the above; surely Google realise that by penalizing sites for links, this WILL, without a doubt, lead to many companies targeting their rivals to sink them.

    I mean, come on, it is happening every single day. I see it all over the place; the techniques that competitors are using to sink their nearest rivals have existed for a while, and if Google really do start penalizing websites with excessive links, then the searches are going to get very messy.

    That whole situation would put ammunition right into the hands of every one of your competitors.

    Surely Google are more intelligent than to allow that to happen?? I certainly hope so..

    Tony

  32. Thanks Bill for the post. The new ‘Over Optimization’ penalty is going to be tricky to track. I see link diversity and branding links playing a bigger role than they have in the past.

  33. Hey, I was curious if this means that exact keyword anchor text over and over could now be a bad thing in the eyes of Google, and whether we would be better off making subtle changes to the anchors?

  34. Pingback: How to Write an Article for SEO | Professional SEO Writer
  35. There are some easy wins Google could accomplish quickly to overcome attempts at over-optimizing sites and backlink building. I was watching the “SE Nuke” how-do-they-do-it video. That product pounds on every blog it can discover, posting followed, linked comments wherever it can. If Google just said “we’re not counting links in comments,” that would get rid of a lot of the abuse and skew. Some blogs do no-follow links in the comments, but Google could flag those links anyway. They shine bright as day with “comment” classes in the mix. That would leave a smaller volume of more legitimate links, where the site owner desired their addition to the site in the first place.

  36. I wonder how much is too much? I use the words “Nashville Web (or website) Design” 4-6 times on my homepage and in the title of all my navigation links. Too much?

  37. It looks like SEO is quickly becoming about adding extreme value around the web and building up a trust rank with your team’s social profiles. I think participating digitally and adding real value online is becoming paramount. It’s looking like companies will need to invest in real “marketing,” creative ways to attract attention. Gone are the days of spammy links, blog networks, crappy article marketing, etc. Now it’s time to invest in leaving comments that add to the conversation, not spamming sites with crappy thin comments for links. Now is the time to invest in solid content and getting guest posting opportunities, as opposed to spinning some crappy article to syndicate to a blog network or some silly link wheel. Now is the time to really, really care and become a real contributor to the web. I am glad this is the direction things are going in. I am on board with trust rank and adding value to the web. I can’t wait for spam to get nuked.

  38. Unless the levels are defined as to what is good and bad, how can the little guys who are trying to do the right thing avoid problems? In general these sorts of issues affect smaller players more than bigger ones, as the big boys understand how to bend the rules to their own advantage.

  39. Over optimization is a really bad practice being used by many people these days in order to rank their sites faster; most of them don’t know that it is not permanent. I came across a few clients who asked me for instant results. Though I explained it to them in detail, they were not in a state to understand.

  40. This post is great; we should pay more attention to what the right balance between SEO and user experience is. I think Google has leaned toward the ‘dampening’ model of handling spam, where they do not punish for it – unless they know you are selling links – but instead try to limit the benefits. Thank you very much

  41. Thanks for your post Bill. It makes me think that someday PersonRank will be as important as, or more important than, PageRank.

  42. Thanks for that useful post. Honestly, I think that there will never be anything like an over-optimization penalty, because that’s pretty much nonsense. Why shouldn’t bigG be happy if you “over-follow” its guidelines? IMHO, Cutts was just talking about gray/dark-gray hat SEO techniques…

  43. I have actually been waiting for this for quite some time, as far as the relevancy is concerned and the focus on returning better results as opposed to those pages that are just the best optimized. As a general searcher I just get so sick of all of the junk pages that turn up on the front page for what I want to know and learn. Made-for-AdSense pages are super annoying and I hope they find a way to change that. Similarly, those with an absurd SEO budget that allows them to buy thousands of links of all degrees may not get punished. The ones that have a high enough budget to spend on a multitude of different anchor texts, all over the net, in different varieties are likely to be okay.

  44. An over-optimization penalty may help level the playing field a bit between highly relevant sites and those that may be less relevant, but are highly efficient at generating backlinks, etc.

  45. I always figured it would be a matter of time before Google started inflicting a penalty for over-optimization. They’ve already started with the link building portion of this process with algorithm changes such as Panda. Now it seems they are moving towards the on-page side of things. I think this will soon pay off for all of the newly “converted” white hats; you’ve got a head start on claiming some new territory that was once held by spam.

  46. Over-optimized websites? But what are the limits? Are blogs with over 30 .edu backlinks over-optimized? Or blogs with over 30 keywords in their title tags? I think Google will focus on social media. The number of your Twitter followers, or the engagement rate on a Facebook account that includes a backlink to your website, will matter. Social media optimization is going to be more important.

    Secondly, I think if you get many backlinks from the same IP, Google can say: hey, stop, there is over-optimization, you deserve the sandbox.

  47. Hi Bill, good article.

    I disagree with over optimization though; how could this possibly work in an algorithm that has to cater for the juggernauts of the web (Facebook, Twitter, Mashable… the list goes on)? Surely the insane amount of links they receive will not be detrimental to them??

    It’s a good idea and would reduce a lot of spam, but I do not see how, in a crowded digital space, it can be put into practice.

  48. Hi Brent,

    I remember being interviewed back in 2005 by someone at the Economist who asked me about Google Bowling, and I was actually a little surprised that he would ask me that, but my answer today is probably very similar to what it was back then. We often don’t have much control over who links to us, but we can work on presenting and providing content that a search engine considers editorially given from quality sources, reliable, and authentic. By attracting and acquiring links like those, we are building some protection for ourselves that can significantly lessen the potential for some negative SEO attempt to harm our pages and rankings.

    For example, I’ve been fortunate to have a number of pages from this site mentioned in patents at the USPTO as “other sources” cited by patent examiners when they prosecuted patent filings. Those aren’t links, but since Google has a patent search, I know they’ve come across those mentions. I don’t know if they give them any weight as a trust signal, but I suspect they could. I’ve also had some mentions and links from sources like the New York Times, the Wall Street Journal, Bloomberg News, and so on that I think also send positive signals. Links and mentions from places like Chambers of Commerce sites, local and regional newspapers, and others that might be seen as authoritative in some way also hold the potential to help shield a site from actions that others might take against it.

    I’m not sure how proactive we can be about who links to us, but we do have the power to present ourselves and our sites in ways that can attract some positive links as well.

    I agree that it does make sense to try to keep aware of what is showing up in serps that might be related to your business, watch what might be linking to you or possibly scraping your content, and try to keep an eye on conversations that might be about your site, and your business and organization.

  49. Hi Chris,

    Thank you. I really enjoyed the visit to Philly, and it was great to catch up a little with you, and meet a lot of the people involved in the search marketing scene in the Philadelphia area. The team at Seer Interactive was incredibly hospitable as well.

    We definitely need to work together and share if we have any hope of growing and advancing in what we do. And that’s part of the fun of being involved in the industry as well. 🙂

  50. Hi Ross,

    It can be a little tricky making some predictions, because there is just so much information that we don’t have access to, from possible pending acquisitions of companies or patents, to business issues involving competitors or best choices amongst alternatives, to the results of testing and experimentation that might determine if one approach is better than another.

    The movement towards showing more “localized organic results” makes a lot of sense because while there’s a strong likelihood that people want to see some local results, they might also want to see some national results as well, and inserting some local result into national or global results is a nice compromise. For a local business or organization to appear as a local result, they need to have taken some steps towards showing that they are indeed local.

    There is at least one whitepaper from Google where they mention that searchers do want to see reviews, and Google is trying to take steps to surface reviews from a “reviews” link in sidebars to the smart snippets that do show star ratings. I don’t think Google is going to try to do away with reviews either.

    If I were to make a couple of quick predictions, one would be more specialized markup involving authorship, bringing syndication meta data from Google News to all websites, where people who implement authorship markup can use meta data on their pages to show where else that content might be syndicated to. So if you write a blog post, and also syndicate it to another site, you can include meta data that shows the URL where it appears as syndicated content.

    Another might be a return of realtime search, which would include tweets, but possibly only from people who are actually contacts of yours on Twitter, and you would have to be logged into your Google account to see those realtime results.

  51. Hi Yousaf,

    Google has taken many steps in the past to address purposeful manipulation of search rankings, and we can look back to 2003 for one of the most visible of those actions in their devaluation of the PageRank of the Search King site, and the legal actions that followed.

    Google made the announcement about taking some significant steps to fight overoptimization, and then seemed to devalue a good number of links from link blog networks. The “announcement” was in reality an aside from Matt Cutts in response to a question at the SXSW conference rather than an official announcement, and it’s Matt’s ongoing job to fight web spam that impacts search results negatively. I suspect he’s constantly working on an algorithm or two that could be said to tackle “over optimization.”

    We know the term “over optimization” is an oxymoron, because optimization means to improve or make something the best that it can be, and you can’t make something better than the best it might be. What he admitted in that conversation, to put it in context, is that Google would be looking at areas where people attempted to stuff a lot of extra keywords into pages or pointed a bunch of links that were only created purely for ranking purposes rather than to deliver traffic to pages, or were editorially judged and given links.

    I don’t remember a statement about building a tool to specifically target paid links, but I do remember a “meet the crawlers” session at Search Engine Strategies in 2005, where a Google engineer was discussing some possible ways to identify paid links. Honestly, people who build big paid link networks and advertise them publicly on webmaster forums, and send emails to Matt Cutts asking him if he wants to buy some links (Matt tweeted about that a couple of weeks ago – they even agreed to send him some examples), may not make it too difficult for Google to find them.

  52. Hi Alexander.

    Anytime that a search engine makes a change to a ranking algorithm, they need to ask themselves some serious questions about how that change might be abused by people attempting to manipulate search results, whether that might mean inflating their own rankings or somehow deflating others. It’s possible that there might be some collateral damage when that happens, but the idea of Google Bowling has been around a long time.

    So imagine that someone buys a bunch of links from a promotion-hungry blog network that makes itself very public via affiliates, forum postings, email spam, and more. Those links don’t point to their own site, but to their competitors’ sites. If Google just decides to devalue those links, the competitors aren’t harmed at all. If Google decides that it might impose some penalties, it’s possible that it might take some steps to determine where those sites were ranked before the links were purchased, how much of an impact the links might have had, and whether those sites were ranking where they were without the purchased links anyway. Google could then determine that it’s unlikely those competitors’ sites would have bothered to purchase those links in the first place, and decide not to penalize them. Now imagine that they had more than the 3 minutes that I took to come up with something even a little more sophisticated than that, and remember that they have lots and lots of computers and lots and lots of data to come up with a method that might reasonably predict whether or not a site might have purchased the links pointed to it, or whether a competitor did instead.

    If I had enough time and budget, and I was #2 in a major industry and couldn’t beat #1, why not screw up their link profile by “over optimizing” their keyword links and bringing them down while you do the right thing for your own site?

    Because a search engineer could easily see that the #1 site was ranking #1 before any of the “purchased” links appeared on Google’s radar, could see that it would rank there without the purchased links, and could reasonably assume that it was unlikely to have purchased those links in the first place.

  53. Hi Chad,

    That particular slide was describing the “user rank” algorithm developed for Confucius, Google’s question-answering service, to try to determine a reputation score for people interacting on a social network. The following post describes that user rank approach in more detail:

    https://www.seobythesea.com/2011/07/how-google-might-rank-user-generated-web-content-in-google-and-other-social-networks/

    Someone who responds to a question or post sometime close to when it was originally published might receive a higher score than someone who waited weeks or months. But it’s not a signal that’s looked at in isolation, and would be considered with other signals as well. And one of the reasons why might be exactly as you state, to “combat rogue comment spamming on old posts.”

  54. Hi Max,

    I agree with you about the dampening approach, but let’s look a little deeper into that.

    If a page ranks nowhere, and the only links it acquires are clearly from a blog network that was created solely to boost rankings in search results, and then it gains some legitimate editorially given links afterwards, chances are that Google has some idea of when it first saw all of those links, and where the page was ranking when it acquired them. If a page ranks extremely well and has a lot of seemingly editorially given links, and then gains a lot of purchased links that only cause it to rank even higher, and those are identified as paid links and devalued while the site still outranks everyone, what are the chances that Google is going to penalize that site? I would say not very good. But Google might decide to penalize the first site, which was boosted in rankings from nowhere on the strength of paid links.

    I’m sure that there are other signals that a search engine could look at as well in determining whether or not to impose a penalty, but I don’t think that they would just impose a penalty if they saw what appeared to be paid links pointed at a page. The concept of Google Bowling has been around long enough for them to find ways to identify it, and plan around it.

  55. Hi Andrew,

    Google did announce that they would be releasing a 3rd party comment system sometime in the near future. Chances are that it would be tied to Google Accounts. The originally filed Agent Rank patent filing included a system that could be used to mark and attach user badges to posts, comments, forum threads and more, and a Google Content Author Badge patent filing spells out something like that in more depth:

    https://www.seobythesea.com/2011/08/after-authorship-markup-will-google-give-us-author-badges-too/

    I suspect we’ll see Google implement something like that before they devalue anchor text from comments or forums. Then again, Google’s Phrase-based indexing and Reasonable Surfer approaches would likely devalue the anchor text and PageRank of those links as well.

  56. Hi Alex,

    I’ve seen some significant changes in rankings for certain queries, and I’m mostly pleased by what I see so far. If it’s the over optimization penalty algorithm in action, I’d like to see more.

  57. Hi Rajesh

    I’m not sure that the announcement by itself struck too much fear into anyone’s hearts, but things like devaluations of link blog networks are causing a lot of people to reconsider some of the approaches that they’ve been following.

  58. Hi Dennis,

    Google has been working to broaden “relevance” to better meet searchers’ intents on an ongoing basis, including blended/universal search results, and it does look like we’re going to see a serious evolution of that.

    I’m not sure what Google’s policies or approaches to implementing different features are, but sometimes the reasons why things might be released in one place and not another might not be a matter of favoring one place over another.

    I suspect that a lot of features get released in English-speaking languages/countries first because a good percentage of Google’s search engineers speak English as a primary language. It’s also considerably easier to release something in just one language and/or country and work out all the bugs before figuring out the special needs of other places and countries.

    Then there are other factors, like, for instance, the fact that latitude and longitude information is considered a munition in China, which caused Google to try to find a way for Google Maps to work there without using lat & long.

  59. Hi Dev,

    Very good points. I agree completely. While it’s important to focus upon the quality of your site and pages, it’s also essential to keep an eye on the framework of the Web around you.

  60. Hi Dr. Wright,

    Google isn’t penalizing for over optimization, because it’s impossible to over optimize something. They are penalizing pages and sites when they think people are purposefully taking actions specifically to manipulate rankings and take advantage of Google.

    Google’s webmaster guidelines say:

    Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

    and

    Don’t load pages with irrelevant keywords.

    If you listen to the statement about over optimization within its actual context, it’s those types of things that Matt Cutts, the Head of Webspam at Google points to as “over optimization.”

    See the Matt Cutts video, Does Google consider SEO to be spam? (the short answer is “no”)

  61. Hi John,

    I’m not sure that Matt Cutts would have phrased his statement about over optimization the way that he did if he intended it to be a specific announcement from Google, but I’m glad that he did give us some advance warning.

    There may be a lot of ways to interpret the “excessive links” statement, such as when a site suddenly starts acquiring a lot more links than it should, such as might happen when a link farm is set up where thousands and thousands of low quality and low ranking pages are pointing to a page on another site. Or when a site suddenly starts getting a lot of links from somewhat irrelevant pages on blogs that never used to link to sites that cover the topics that the pages being linked to cover.

    Buying links vs buying blogs? Google did publish a patent application last fall about How Google May Identify When Sites Transform into Doorway Pages. Buy a site and run it like it was in the past and maybe add a few links, and you might not have a problem. Buy it, and make a lot of changes, including removing different topics from old pages, adding lots of new topics, and linking to sites that might be irrelevant to the content on some older pages, and Google might devalue links from that site.

  62. Hi Sandra,

    A good starting point might be combing through the Google Webmaster Guidelines to see if you might potentially be doing something that they might construe as having violated one of those.

    Another step you’ll probably want to take is to go through the site carefully to see if there are errors or problems that might have contributed to a fall in rankings. Have you made some recent changes, updates to a content management system, installation of a new plugin, movement to a new domain, or addition of a lot of new content? You’ll also want to look outside of your site, and ask yourself questions like whether you have a lot of links to your pages, or whether someone who had a significant amount of PageRank removed one or more links to your site or went offline.

    You may also want to check whether your competitors have made some positive changes to their pages that caused them to start ranking ahead of you.

    Before assuming that you might have been penalized in some way, look for other possible reasons why your traffic has dipped, and things that you can do in a positive manner that can improve your pages and traffic.

  63. Hi Craig,

    I think you’re right on target. It appears that Google has been stepping up their hunt for manipulative methods that they’ve been warning about for a long time in their webmaster guidelines.

  64. Hi Matt,

    A lot of SEOs practice a holistic form of SEO that already encompasses the aspects of inbound marketing that Rand has been writing a lot about recently, including me. I wouldn’t even buy a link from Yahoo. 🙂

  65. Hi Charles,

    I think the last time I might have stuffed a page with too many keywords was when AltaVista first launched, and their FAQ said you could use 1024 characters in your meta keywords tag. I had so many words in there to reach that limit that it seemed wrong even then.

    It’s likely that Panda is a statistically based prediction algorithm that attempts to look at features on pages to predict user engagement, without actually relying upon user engagement itself, but which can look at actual user engagement metrics as feedback to fine-tune the approach. I don’t expect those Panda updates to suddenly stop coming anytime soon. 🙂

  66. Hi Peter de W,

    I think the Google Webmaster Tools provide a decent start on an understanding of the kinds of things that Google likely doesn’t want to see.

  67. Hi Mathieu,

    Thank you, and thanks for deciding to leave a comment. It’s good to meet you.

    I agree that thinking of what’s good for a visitor to your pages is a very reasonable guideline, and can be pretty helpful in making some decisions about things you might include on your pages. Would I really want to visit a page to learn about something and be faced with some really low content created solely to provide a way to repeat the same word over and over and over again? Probably not.

    Does it make sense to stuff a page title, meta description, headings, with lists of keywords rather than interesting and engaging text that might convince people to click through search results and stay on a page when they arrive?

    When I read a page, is it really about something, or is it about little more than repeating a lot of the same words or phrases over and over?

  68. Hi George,

    I think Google was being pretty sincere about working upon one or more algorithms that might work against keyword stuffing and excessive linking. Seems like business as usual in trying to tackle webspam.

  69. Interesting article. Google’s methods are changing so fast these days it is almost impossible to keep up 🙂 I started a new site and within 1 week 37,000+ pages were indexed and a few were ranking quite well. I think it still helps to get a few good links back to your site from very credible, high-PR sites. The benefit I see of Google improving its technology is that it will really help those sites that have been slowly growing for many years, and there will be no get-popular-quick schemes that work.

  70. Hi Ginie

    Google isn’t fighting “over optimization.” Google is fighting web spam, and has been targeting it for years. If you’re not doing things like buying links from private blog networks, creating excessive amounts of very low quality pages in order to link to other pages with certain anchor text, or stuffing lots and lots of keywords into web pages, titles, headings, etc., and instead are following SEO best practices that improve the quality of your web pages and attract links to your pages, you shouldn’t really have too much to worry about.

  71. Hi Becca,

    You sound as if SEO is a bad thing. It’s not, and never has been. There are people who do try to sell shortcuts that are explicitly against Google’s guidelines and call what they do SEO. If you asked the people at Google about those practices, they would likely tell you that it’s not SEO, but rather spam.

    Google’s not going to penalize your website for putting your URL on your business cards.

    If Google thought that SEO was a bad thing, they wouldn’t have published a Search Engine Optimization Starter Guide.

  72. Hi Alex JG,

    Relevance is the number one priority with Google. Their aim is to provide the most relevant and useful results to searchers that they possibly can. Their success as a business hinges upon it.

  73. Hi Jess,

    There’s never been anything stopping searchers from typing out very long and very specific queries. Google doesn’t cap off searches at 2-3 words. Google is reliant on searcher activity, and how people search. Google performs a lot of experiments when they do things like include one-box results for queries, or blend universal results into pages. If people didn’t click upon those types of results, it’s likely that Google wouldn’t show them.

    What exactly is “existing SE behavior”? Just matching pages to the keywords that appear on them died with Excite, Lycos, AltaVista, HotBot, and other search engines that didn’t do much to try to understand some of the intents behind those searches.

    Existing search engines are trying to do a better job of understanding the intents behind queries, and it’s not an easy problem. But I don’t see a problem in search engines trying to do that.

  74. Hi Joe.

    There’s no such thing as “over optimization.” There’s no fine line. 🙂

    If someone goes to a forum with the intent to find places where they can buy links for their website, they aren’t doing SEO. They aren’t optimizing their web pages. Instead, they are trying to game Google.

    Google has been working on algorithmically and manually penalizing web spam for years, and is growing increasingly better at it. I expect that they will continue to do so.

  75. Hi Nik,

    I agree, but I just really can’t bring myself to call that type of intentional gaming of the search engines “over optimization.” 🙂

    It’s not optimization, and there’s no such thing as over optimizing something.

  76. Hi Tony,

    This may be why Google started sending out unnatural link warnings in Webmaster Tools.

    I suspect that for the vast majority of sites that purchase links, or that might have had links “purchased for them,” Google likely just devalues the extra links pointed to them. As I mention above, Google shouldn’t have any problem identifying where sites were ranking when “purchased links” were pointed at them.

  77. Hi John,

    I’m pretty much to the point where I believe that Google may just be taking more steps to identify web spam. If you’re doing SEO, that shouldn’t be a problem. If you’re working on attracting and acquiring links by producing great content, by building smart relationships with others to cross promote your site or sites, by interacting with others in meaningful and positive ways, then those are good things.

    Give people reasons to link to what you create on the Web; solve problems for others, build creative and engaging content, provide something that people want to share with others.

  78. Hi Mike,

    I’ve considered removing the dofollow plugin on this blog a number of times, especially when I see it listed on some “blogs to get links back from” lists on spammy forums and on blogs that tend to be aimed at gaming search engines. I delete a lot of blog comments that use anchor text instead of names in the name field, regardless of the quality of the comment – I have some very clear text explaining that above the text box where someone comments.

    I appreciate comments left by people who really do want to leave thoughtful comments that help expand upon what I’ve written, or include meaningful questions, or hold unique perspectives. I don’t mind a link pointing out to people’s pages when that happens. I’m not going to let Google tell me that I can’t do that.

    Chances are, that under something like the reasonable surfer model, Google really doesn’t pass along a lot of value in comment links anyway. People pursuing a path to getting links from comments could probably be spending their time in much more productive ways, even if they use some automated tool. The great thing about someone using automated tools to comment spam on a large scale is that it becomes easier and easier for a search engine to identify it happening as it gets scaled up.

  79. Hi Thomas,

    There are a lot of ways that you could send a signal to Google that you are a web design company in Nashville that don’t necessarily have to include repeating the same phrase over and over and over. Host some design meetups in Nashville. Include testimonials from Nashville clients. Include directions to your office on a page on your website. Write about tech events in Nashville on your blog. Design some local websites pro bono for some Nashville nonprofits. Join a local Chamber of Commerce. Be a part of your community, and include content and commentary about your community.

    Optimize for local search by having your location and address information in telecom directories, in business profile sites, in regional directories, on social networking sites like Facebook and LinkedIn and so on.

    Google probably isn’t going to penalize you for having the phrase “nashville web designer” on your home page 4-5 times, or in some anchor text on your pages, but if that phrase is so important to you, some online and offline networking with actual people in Nashville, and some other ways of showing that you do web design for companies in Nashville could potentially be much more helpful.

  80. Hi Matthew,

    All great suggestions on things that people who hadn’t been doing those types of things should start considering. Thanks.

    Those are things that many of us have been trying to do for years, and they can be very effective. I agree that they will probably become even more so.

  81. Hi Gaz,

    It is possible for small businesses to be successful on the Web without doing things like buying links or spamming blog comments or stuffing pages with keywords.

    The Google guidelines are a decent starting point for the kinds of things that Google doesn’t want to see, but also look at their help forums and at the SEO starter guide that they published. Seriously question it when you see people promoting methods to rank your pages that seem like they might be too good to be true. They are.

    Google does state pretty clearly that webmasters should avoid link schemes designed to manipulate rankings in search results. When you go to a forum and someone posts about participating in a link blog network that can help you climb in the rankings, it is pretty clear that goes against Google’s warning about link schemes.

  82. Hi Chris,

    Thank you.

    I agree that it’s likely that things like agent rank or author rank will play a larger and larger role in how pages are ranked by Google in the future. Here’s a recent post that describes that movement pretty well:

    http://www.blindfiveyearold.com/author-rank

    The amount of work it will take to run link blog networks effectively in the future will also make them prohibitively expensive to run. Maybe not today, and maybe not tomorrow, but regardless of the risk of their being surfaced and devalued by Google, they will likely just not be effective without real people with real levels of authority and reputation behind them.

  83. Hi Danilo,

    Thanks. I think you’re spot on. It is nonsense.

    There are a lot of site owners out there who have relied upon promises of easy rankings and haven’t worked to learn how to attract or acquire links to their sites in reasonable and sustainable ways. I’m hoping that many of them are taking the statement from Matt Cutts as a shot across the bow, and as a warning to look into and learn actual SEO.

  84. Hi Manendra,

    There are a lot of site owners who want to run businesses online, and want to focus upon building their businesses and making them successful without having to do a lot of the work to promote their businesses. Many want instant results and are willing to invest in those, even with warnings that there are serious risks involved. In those instances, sometimes all you can do is explain the risks and move on.

  85. Hi Agus,

    Thanks. I think we’re going to see Google trying to be a little more active in finding and devaluing spam, and maybe doing some things to benefit sites that are relevant for queries but might not be doing the best that they can to optimize their pages.

    For example, if Google tried to start reading text within images in the near future, that might benefit site owners who didn’t realize that they should include things like their business names and their addresses in actual text on their pages. That would fall in line with Matt’s statement in that presentation about “making Googlebot smarter.” Google could also potentially do other things like that as well.

  86. Hi Scott,

    In an ideal world, the sites that are best optimized are the sites that are most relevant for a specific query. Those would be the pages that are actually about the query used to find them, using title elements that actually describe what is on their pages, meta descriptions that describe those pages, headings that really are about the content they introduce, and so on.

    I’m not sure that the statement from Matt Cutts is pointing at something that is going to be helpful to pages that might contain great content but continue to do things like using the same generic title for every page of the site, or putting important text in images. Those things aren’t “over optimization,” but rather little to no optimization at all. They might be doing something like that, but we’ll see.

  87. Hi Joel,

    Google’s always sort of focused upon a balance between relevance and popularity, with popularity measured by things like PageRank and the ability to attract links. Google might be looking to move that dial a little to possibly give “relevance” more weight (a toy illustration of that idea follows below). I’m not sure that the statement from Matt Cutts at SXSW gives us enough information to make a good guess.
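
    Purely as an illustration of what moving that dial could mean (the blend, weights, and numbers here are my own assumptions, not anything Google has published), a combined score might look something like this:

```python
def combined_score(relevance, popularity, relevance_weight=0.5):
    """Toy blend of a relevance score with a popularity score (e.g. PageRank).
    Raising relevance_weight is the 'dial' moving toward relevance."""
    return relevance_weight * relevance + (1 - relevance_weight) * popularity

# The same page, scored with the dial in two positions (illustrative numbers only).
print(combined_score(relevance=0.9, popularity=0.4, relevance_weight=0.5))  # 0.65
print(combined_score(relevance=0.9, popularity=0.4, relevance_weight=0.7))  # 0.75
```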

  88. Hi Ryan,

    I think Panda has been focusing upon onpage content, too.

    Google’s been penalizing sites for web spam and link schemes for years. They might pick up the pace, but if they do, what they won’t be doing is penalizing anyone for “over-optimization.” That’s really just a misnomer for trying to make a page appear more relevant for something than it really is, like trying to make a page about a topic simply by repeating that topic over and over rather than actually making the page about it.

  89. Hi Özenç,

    It’s really hard to say where Google might draw the lines. I don’t think that Google will penalize sites for legitimate SEO practices, but rather for the kinds of things that they have been warning about in their Webmaster guidelines for years. I don’t think you need to worry, for instance, about having too many links from .edu sites, unless the .edu pages they come from are old and abandoned class discussion pages that are filled with spam. And Google might just devalue whatever those might pass along, unless a much more sophisticated analysis determines that a page in that situation should be penalized.

    With links from the same IP address, again it’s possible that Google might be, and even has been, limiting the amount of PageRank it might pass along from those links.

    I expect that we might see more social signals in the future that might influence rankings, especially with Google Plus around, and things like authorship markup, where Google can associate content with Google Profiles, and where Google can create reputation scores to associate with those profiles.

  90. Hi Sam,

    I don’t think that Google will penalize any sites for editorially given links pointed to them. I don’t think what was being referred to by Matt Cutts is really an “over optimization” penalty, but rather a spam penalty.

    Google does seem to have been very active recently in doing things like devaluing link blog networks, and I think that’s more along the lines of what Matt was talking about.

  91. Hi Greg,

    Thank you. Very good points.

    There are a lot of changes going on at Google, from the Search Plus Your World update, to many small updates that Google has been sharing about on the Google Inside Search blog (and likely many more that they haven’t said anything about).

    Nice growth on the new site within a very short period of time. I agree that some links from pages that might be considered “very credible high PR sites” can be really helpful, and that’s unlikely to change any time soon. The get rich quick (or get lots of links quick) schemes do seem to be something that Google is targeting with a little more intensity these days.

  92. Thanks for this post. I feel that over optimization has always been a huge problem. I have many niche sites that I feel are of value to the internet, and there are sites ranking above me that have one page of content and a huge number of links.

    I know I am not the only person experiencing this either; it has happened to many great sites that should be #1 on Google but sit at #5 because of their link counts and because they don’t use the keyword in every link.

  93. Thanks Bill, I’m constantly amazed at the depth your articles go into, and this was no exception. A very interesting insight into the direction that Google may be going.

  94. I think this is a win for the good guys. I’d much rather spend my time generating legitimate social media buzz than trying to create as many anchor text backlinks as humanly possible. See you later web spam.

  95. Hi Bill,

    Would article syndication as a link building approach be counted as spammy, and fall into the category of over-optimization?

  96. The problem I always find with the way the Google algorithm changes is that by the time you have managed to come up with a way to make your website fit their preferred way of listing sites, it’s time for them to change it again. I agree that the most important thing is content, but if Google keeps changing how it chooses who shows up where, then it gets to the point where you worry your content will not be seen!

  97. Hi Adam,

    I know that I have a number of pages that I feel should rank more highly than some of the pages above them. Sometimes Google’s rankings can be a little infuriating or frustrating.

  98. Hi James,

    Thank you very much. We can make some educated guesses about where Google might head, but we don’t always have enough pieces of the picture to have a lot of certainty.

  99. Hi Henry,

    I’ve never liked article syndication as a link building approach, especially if you syndicate an article that you also publish on your own site. You run the risk of a syndicated copy outranking your own page in search results, and you might not get a lot of link value out of the copies published elsewhere.

  100. Hi Charles,

    I’m with you there. Do something interesting, something newsworthy, something different to create some buzz, and get some attention. It’s a lot more interesting.

  101. Hi David,

    I think there are some fundamentals that remain close to the same, but you’re right that changes happen fast. The good thing about social media is that your focus should be on finding ways for people to see what you create.

  102. The relevancy among the keywords must also exist, and the algorithms will keep making improvements; quality backlinks are also being considered a lot more as part of Google’s strategy.
    However, the penalties regarding spamming still need to be removed; the same can be seen with forum posting and commenting, where those backlinks are now being thrown out entirely from Google’s point of view.

  103. Hi theplasticman,

    There’s never ever been a guarantee from Google that every link pointed to a page is going to carry weight and PageRank, and there have been a lot of people who have developed links in manners that work to artificially inflate the value of links.

    Some of these involve the use of automated tools that blast out thousands of links in blog post comments and on forums. Some of these involve the sale of links in blog networks filled with spun content or low quality content. Some involve taking over abandoned forums on .edu sites set up for classroom discussions that have finished serving their initial purpose.

    While Google may not have counted links from many of these sources in the past, that doesn’t seem to have stopped people from trying. If Google has started to penalize sites using backlinks like that, maybe they are doing so under the assumption that it might help stop some of that behavior.

  104. “3) You need to take a good look at your site content and see if it’s truly providing value. Good enough ain’t going to cut it.”

    That, I believe, is the key and what Google is trying to focus on. Essentially, you need to have a site with good unique content that is not manufactured but is viable and beneficial. From there, people will link back naturally and legitimately to your site because they agree with the vision and the content. So putting more time and energy into working on a great product or content, rather than just trying to make a domain successful, should be our goal.

    Great thoughts in the article, Bill.
