Will Google Use Social Engineering to Fight Search Engine Spam?
If you change your site in ways that Google may consider spam, Google may not change the rank of your page in the way you expect, according to a patent granted to the search engine. It may increase your rankings, decrease them, or make no changes to them at all.
You can see the kinds of tactics that Google might frown upon. Google’s Webmaster Guidelines highlight many of the search engine spam practices the company warns against, tactics someone might use to try to boost rankings in ways intended to mislead the search engine. The guidelines start with the following warning:
We strongly encourage you to pay very close attention to the Quality Guidelines below, which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise affected by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or any of Google’s partner sites
A Google patent granted this week describes how the search engine might respond when such search engine spam practices may be taking place and may be leading to improved rankings of pages in search results. The following image from the patent shows how search results might be reordered based upon such rank-modifying spam:
These search engine spam practices, referred to as “rank-modifying spamming techniques,” may involve:
- Keyword stuffing,
- Invisible text,
- Tiny text,
- Page redirects,
- Meta tag stuffing, and
- Link-based manipulation.
While the patent defines these practices, I’d recommend reading the definitions in the quality guidelines on the Google help pages, which provide much more detail. What’s interesting about this patent isn’t that Google is taking steps to try to keep people from manipulating search results, but rather the steps it might take when people may be engaging in rank-modifying spamming.
The patent is:
Ranking documents
Invented by Ross Koningstein
Assigned to Google
US Patent 8,244,722
Granted August 14, 2012
Filed January 5, 2010
Abstract
A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.
When Google believes such techniques are applied to a page, it may respond in ways that the person engaging in spamming might not expect. Instead of increasing rankings of pages, or removing them from search results, Google might respond with a time-based “rank transition function.”
The rank transition function provides confusing indications of the impact on rank in response to rank-modifying spamming activities. The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.
Imagine you have a page in Google’s index, and you work to improve the quality of the content on that page and get many links to it. Those activities could cause the page to improve in rankings for certain query terms. The ranking of that page before the changes would be the “old rank,” and the ranking afterward would be the “target rank.” Your changes might be the result of legitimate modifications to your page. A page using techniques like keyword stuffing or hidden text might climb in rankings as well, with an old rank and a higher target rank.
The rank transition function I referred to above may create a “transition rank” involving the old rank and the target rank for a page.
During the transition from the old rank to the target rank, the transition rank might cause:
- a time-based delay response,
- a negative response,
- a random response, and/or
- an unexpected response
For example, instead of raising the rank of a page when there have been modifications to it and/or to the links pointing to it, Google might wait for a while and even cause the rankings of the page to decline initially before it rises. Or the page might increase in rankings, but on a much smaller scale than the person making the changes might have expected.
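To make the idea concrete, here is a minimal sketch of what a time-based rank transition function could look like. This is my own illustration under stated assumptions, not code from the patent: the function name, the delay and transition lengths, and the 30% chance of an inverted response are all invented, and a rank is treated as a result position where 1 is best.

```python
import random

def transition_rank(old_rank, target_rank, days_since_change,
                    delay_days=14, transition_days=60, jitter=2, seed=None):
    """Hypothetical time-based rank transition (1 = best position).

    Before the transition rank finally settles at the target rank, it may be
    delayed, pushed the "wrong" way, or nudged randomly -- varying over time
    even though nothing else about the page has changed.
    """
    rng = random.Random(seed)

    # Time-based delay: ignore the change entirely for a while.
    if days_since_change < delay_days:
        return old_rank

    # Transition complete: the target rank finally applies.
    if days_since_change >= delay_days + transition_days:
        return target_rank

    # During the transition, drift from the old rank toward the target rank...
    progress = (days_since_change - delay_days) / transition_days
    rank = old_rank + progress * (target_rank - old_rank)

    # ...sometimes invert the response, so an "improvement" looks like a drop...
    if rng.random() < 0.3:
        rank = old_rank - 0.5 * (target_rank - old_rank)

    # ...and add noise so the response looks random and unexpected.
    rank += rng.uniform(-jitter, jitter)
    return max(1, round(rank))
```

Under those assumed numbers, a page whose target rank is better than its old rank would sit still for a couple of weeks, occasionally get pushed the wrong way, and wander noisily before settling at its target.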
Google may monitor changes to a page and to the links pointing to it to see what type of response there is to that unusual ranking activity. So, if someone stuffs a page full of keywords, instead of the page improving in rankings for certain queries, it might instead drop in rankings. If the person responsible for the page then removes those extra keywords, that may indicate that some kind of rank-modifying spamming was going on.
So why use these types of transition functions?
For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and thereby render the long-term impact on the document’s rank zero.
Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform extra changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
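As a rough sketch of how that observation step might work, the snippet below flags an edit that is rolled back while a document’s rank is still in transition. The function name, the content-hash comparison, and the simple A-then-B-then-A revert check are assumptions of mine for illustration; the patent does not spell out how the monitoring would be implemented.

```python
def looks_like_reactive_spam(change_history, transition_start, transition_end):
    """change_history: list of (timestamp, content_hash) pairs in time order,
    one entry per observed version of the page.

    Returns True if a modification was reverted to an earlier version while
    the document's rank was still in transition, the kind of reaction the
    patent suggests may point to rank-modifying spamming.
    """
    for i in range(2, len(change_history)):
        timestamp, current = change_history[i]
        _, previous = change_history[i - 1]
        _, before_that = change_history[i - 2]
        in_window = transition_start <= timestamp <= transition_end
        # A -> B -> A inside the transition window: the suspicious change
        # was rolled back once rankings moved in an unexpected direction.
        if in_window and current == before_that and current != previous:
            return True
    return False
```

Additional changes made during the window (rather than reversions) could be checked in a similar way; either reaction gives the search engine another data point.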
The rank transition function could impact one specific document, or it might have a broader impact over:
“the server on which the document is hosted or a set of documents that share a similar trait (e.g., the same author (e.g., a signature in the document), design elements (e.g., layout, images, etc.), etc.)”
If someone sees a small gain based upon keyword stuffing or some other activity that goes against Google’s guidelines, they might engage in some similar extra changes to a site involving things like adding more keywords or hidden text. If they see a decrease, they might make other changes, including reverting a page to its original form.
If there’s a suspicion that spamming might be going on, but not enough to positively identify it, the page involved might be subjected to fluctuations and extreme changes in ranking to try to get a spammer to attempt some kind of corrective action. If that corrective action helps in a spam determination, then the page, “site, domain, and/or contributing links” might be designated as spam.
If those are determined to be search engine spam, Google might investigate further, ignore them, or degrade them in rankings.
Google did come out with a similar patent involving local search, which I wrote about in How Google May Respond to Reverse Engineering of Spam Detection.
What do you think of this approach?
Updated February 7, 2020.
Once I had a mail from Google which said that I had been building illegal links, and it asked me to remove the links from the websites where I found the links were spammy. But I was never into building illegal links, and going to Google Webmaster Tools I found there were some sites which provide information about sites like my blog’s latest posts, so around 500+ of my links were coming from those sites instantly, and that was the reason for getting the notice from Google.
I’ve read the above post a couple of times, and somewhat understand the way you’ve attempted to explain this. In many ways, I’m still confused.
I wonder if a site getting “STUCK” in the SERPs, say on page three (3) or any other page, would have something to do with what you’ve indicated above. I’ve been looking for reasons as to why a URL would get “stuck” in the search engine results pages, but… have still not been able to come to a fair conclusion.
Would you say that this has anything to do with it, and if not: I wonder what would cause a “ranking stick”.
It’s an interesting idea, but I can see a big problem for agencies that perform legitimate (i.e. entirely within the guidelines) SEO.
We concentrate on white-hat SEO – quality content, good HTML markup, good internal links between relevant pages and real external links from a range of sources, most not under our direct control.
If our clients see a rank for a page, section or site we are working on suddenly drop because we have made improvements to a page, we may well struggle to explain it. ‘It’s just one of those Google things’ does NOT go down well. We may get all the ranking (and more) back the next month but by then it is too late – the contract is cancelled, and the client may not even be aware their ranking has recovered.
Of course, most clients are reasonable, and are aware of the time it can take, but seeing a negative result after the first month’s work would be very difficult, especially for a new client.
Jonathan
SEO & PPC Specialist, Splice Marketing Ltd
Hi Bill, thanks for sharing. I wonder if this is related to Google’s next Penguin update that Matt alluded to at SES SF earlier this week (described here). This patent is both brilliant and terrifying at the same time. It makes sense and will ferret out a lot of SPAMMERS, but will kill many real SEOs promoting legitimate websites as collateral damage. If I understand your interpretation correctly, the key is to not over-react to changes in the SERPs and to get really familiar with the Google webmaster quality guidelines. Based on your experience, any idea on the timeframe for when Google will reduce this to practice/implement/roll out updates as a result?
They’re smarter than they look.
This approach would certainly help explain why Google has waited so long to apply a meaningful Penguin refresh.
If so, I would say Google has been using this time to accumulate sufficient observations of webmaster activity to implement these new criteria.
The new patent might also offer some new insights into Matt Cutts’ and Google’s recent statements not to overreact to Unnatural Link Notices, as well as Cutts’s observations (on 8/16/2012) that a significant number of webmasters are going to be surprised by the results of the next Penguin refresh.
Very interesting.
This goes quite some way to explaining the often random response you first get when you start link building to a specific page.
It makes sense; if they were to give an immediate and positive response to a manual SEO change then it would lead the way to much more accurate reverse engineering attempts. By inflicting a negative response on a positive change, it keeps us guessing as well as unveiling spam efforts.
Damn brainiacs.
This is really fascinating stuff. Essentially Google uses transition rank to screw with spammers, causing them to react in ways that make it easier to identify their behavior.
I really enjoyed this post Bill!
The rank transition function is interesting for at least two reasons.
First, the “wait-and-see” approach is an excellent way to let spammers incriminate themselves. If you’re hypersensitive about your rankings, you’re bound to try all kinds of things to influence those rankings. By observing those activities in a rankings sandbox, search engines can gather additional data points without compromising the “integrity” of the SERPs in the process.
Aside from its impact on spammers, the function also has an interesting effect on search-related research. Many studies rely on the relative ranking gains/losses of pages with different modifications to help nail down the weightings given to various ranking factors.
Obviously, if the rankings incorporate elements of randomness (both in terms of how rankings will respond to page modifications and in terms of how long it takes for those rankings to change), those studies are very likely to draw incorrect conclusions.
One of the key takeaways from all of this is pretty simple: if rankings are your KPI, you’re doing it wrong 😉
This patent really makes sense and I feel that they have been doing this for years. When looking at historical ranking reports for keywords we are targeting we almost always see dips before rises, but the trend is always upwards.
I can see this transition rank function being applied to many types of content improvements in competitive industries, not just overt spam. As an SEO, it’s very tempting to roll back changes if you roll out a change and rankings immediately head in the wrong direction.
Google makes it difficult for webmasters to promote their websites… But I agree with this article that spamming must be penalized… Copied content is another form of spamming… Thanks for focusing on Google’s ranking system and providing some tips to improve our ranking…
Google’s ego really knows no bounds. Sadistic geeks, the lot of them.
This is a really interesting approach to spam detection. I can’t say that I hate their approach entirely, but anyone who cares about their website will make efforts to increase their relevancy, i.e. gain in rank, and some entirely positive non-spammy sites will get hit.
If most people see a sudden dip in rank after making a change to their site they will try to revert the site back to the way it was. Some white hat SEOs may know better than to remove said changes, but what about other people? Many regular people who aren’t trying to game the system may see this dip in their local business site and panic, and that may inadvertently ruin their site, all without the help of any SEO, whitehat or blackhat.
Google may be doing themselves a disservice. They want to keep manipulation of sites out of the mix, but as regular people who don’t follow trends and changes see their sites plummet, they may be running to professionals to help unscrew innocent sites. But that’s my opinion.
Sounds like a military style triangulate and destroy tactic. Nice!
Fascinating stuff… Google is more or less penalizing site owners who actually monitor and pay attention to rankings and search traffic. Most knowledgeable webmasters will at least _try_ some changes if they see a major drop in rankings.
This is a really smart filter by Google, though, because it causes the webmaster to “out” himself. Google doesn’t have to invest the time deciphering whether or not the site is using malicious tactics if they force websites to make it obvious.
To add to Stacey’s point, very few website owners will understand this change, nor will they know to tell their newly hired SEO company that they made these changes and that is what affected them. As a result, SEO companies will be getting a lot of new business because of these changes. Many SEO companies will also have a bit of a clientele shuffle as a result. On a bad note, if an SEO company does not communicate these changes to their clients NOW, they may also be caught in the shuffle.
The problem here is that you’re counting on Google to determine what is spammy and what is not.
What happens to a site hit with negative SEO attacks?
What happens when talking a lot about a particular topic APPEARS to Google as keyword stuffing, when it isn’t?
What happens when I have ten guest post requests, and I write something for all of them, only to find out later that one of them was connected to a link farm?
So, basically, no one ever do anything to their sites that would increase their rankings. Because if you did something and it modified your rankings, then it’s probably spammy.
I agree with Steve that this patent is the result of a strategy that Google has been devising for several years. It’s certainly helpful that Google is trying to combat spam, but we’ve noticed that some useful sites have experienced a decrease in rankings as well.
Fascinating stuff, thanks as ever for describing a complex patent in plain language. Fair play to Google’s web spam team, their cunning equals their technical sophistication at times.
Bill, would you foresee this being applied as a manual intervention on specific sites, perhaps prior to applying a penalty as per the penalties that precede some websites’ unnatural link warnings, or could this be rolled into the main algo? The latter seems unlikely to me, as the processing grunt required to apply this to their entire index would surely be too burdensome?
I’m not sure if the rank transition algo is already in place, but yeah, it looks like they already have it rolling. Just last week, I was doing some kind of on-page optimization for better site structure on internal links, and I was surprised that the rankings went down right away after a day, so I thought Google may not like it and I rolled the changes back. The rankings went up after a day, better than they used to be before the changes, but after 3 days they went down again, a little lower than the rank after I did the changes. It could be that they detected activity similar to those spammers even though I’m not doing any spamming at all.
I’m interested in Drew Allen’s conclusion and I agree with that: “So, basically, no one ever do anything to their sites that would increase their rankings. Because if you did something and it modified your rankings, then it’s probably spammy.”
So why use these types of transition functions?
Simple. So that SEOs can’t figure out what works and what doesn’t. It’s a great way to keep spammers from manipulating search results.
Keep them off balance and keep them confused and annoy their customers in the process.
It seems to me that one of the only ways to proceed as an SEO is to add a time delay to your official results…meaning, don’t consider your efforts as having taken effect until things settle down after adding content and building links.
Mark
I guess this is what Matt Cutts meant when he said that SEOs and webmasters will not like the new update.
.. let’s just hope this doesn’t prove to have more false positives than actual positives.
Very interesting way to handle spammers. Also, as stated above, one can wonder why they get locked for a term at a certain placement or page.
The transition function is a very interesting patent. I suppose the best thing to do is to continue to follow Google’s guidelines as closely as possible, and not to react negatively to initial movements in SERP rankings. I’d also say that it speaks for making bulk changes for certain activities at spread-out intervals rather than making lots of small changes on a regular basis.
Recommendations here point to becoming intimate with the Google Webmaster guidelines…read, reread, and take care with on-page optimization. Good advice for any SEO.
WOW. I’ve seen this before but never had a Name or Label to attach it to. I just called it a “change-sandbox.” Seems like my holding period was about 2-3 weeks.
Thanks Bill
Very interesting and quite a brilliant way to weed out spammers, but not only them, I would say. It seems that any action taken by a webmaster or SEO after an unnatural links warning message will make him look like a spammer; it almost makes you think it is better to not do anything so no red flags are raised.
Very interesting. What is your take on transition period rankings and links. It seems a bit messier to me than on site changes. I’m thinking the following:
A page gets a few links
Weird things start happening in search results
Usually you can’t just switch on or off links. Adding might work, removing is trickier.
If I add more, what will that indicate or if I remove, what would that indicate?
Patent says this: “The rank transition function provides confusing indications of the impact on rank in response to rank-modifying spamming activities.”
Matt Cutts says this: “we don’t want anybody to get caught by surprise”
What a BS statement that was…
Awesome. Now that’s how to enhance an algo! This is why it is very important to stick to the source: http://www.google.com/webmasters/
Interesting way to deal with spam. I have seen this before but never had an official line to put to it. Thanks for sharing.
There will be a substantial number of people who are not happy about what Google is trying to do, but this is one way to ensure the quality of any site and to stop webspammers from bombarding the search engine with irrelevant pages that have absolutely no content.
I think I have seen some of this recently – on page changes having unexpected results.
Something basic like adding a sensible (non-spammy) H1 to a page that previously had none, causing a fluctuating drop, eventually followed by stabilizing at a higher rank for the keyword.
The part about drawing spammers out is pretty amazing stuff. Getting spammers or even grey hats to panic and do their “best” to counter will reveal even more spam tactics.
Thanks OP for this. Just ordered the full patent text. Google dance is now explained and spammers know how to spam smarter.
Thanks for sharing this. Goes a long way towards explaining some of the observations I’ve been making recently.
I know this patent was only recently granted, but who can say for sure this technology hasn’t already been applied since 2010 when it was first filed?
This is fascinating stuff…it also makes perfect sense that Google would use this tactic. Luring spammers into possibly further website modifications to counteract the apparent poor rankings and therefore making themselves more obvious to the algorithm! Brilliant.
Unfortunately the patent doesn’t go into details when it comes to the “ranking factors associated with the document,” especially which factors could turn on the “red light.” Anyway, with that method they will make SEO much more difficult, because you will never know if you are on the right path or not.
So as a Web Publisher who practices SEO I have now been classified as a “Rank-Modifying Spammer”!
And there are people in this comment thread calling Google “Brilliant”?
How is one supposed to follow Google Quality Guidelines such as
“Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.” if changing the words on your page will cause your site to shoot up, possibly down, or sideways for an unknown period of time?
What a bunch of crap, IMO. Trying to improve the quality of your site? Well, let’s drop you in rankings so that we can see if you’re a spammer!
Really Google? You can’t determine this in some other way? I can already anticipate the emails and calls I will get from clients now – “Um. I made those changes you told me to and now we’ve dropped in rankings”.
Even pointing a client to what Google says that this is part of their new way of catching spammers, they won’t care, and will they have the patience to ride it out until their rankings improve?
There are better ways to do this; all Google is doing is taking the easy way – “Oh, just follow OUR guidelines on what we think makes for a good site (i.e., make it easier to crawl, and detect potential spammy behavior) and you will have our blessing.”
Like kissing the ring of the mafia godfather.
Most of the spam strategies named above are ones that serious SEO agencies don’t use anyway. It’s telling that Google’s last Penguin update had an effect on no more than a few percent of, for example, German websites.
While I understand how difficult it must be to try and stop “spammers”, many of Google’s recent efforts are going to hurt totally legitimate SEO. As others have pointed out, white hat SEO involves optimizing your on-site copy + legit link building. Most people aren’t going to see this post or find out about this patent, and may accidentally get labeled as a “spammer” when they try to decrease actions after a ranking drop or increase actions after a gain. It’s ridiculous for a business NOT to try to do everything they can (with an understanding of risk vs. reward) to rank better. And this patent seems to be pointing to Google not wanting anyone to attempt that.
Current SEO best practices should involve minimal attention to short term variations, and focus on solid tactics for the long term.
Oops. The system removed from my posts what was in brackets. I was talking about changes in H1 and H2 tags.
I think Google is using this technique not only against unnatural links, but against changing the content of sites as well, particularly against changing tags. I’ve noticed on several occasions my pages dropped significantly in the SERPs after I made changes in H1 and H2 tags.
While I think this is definitely brilliant in some ways, it certainly goes against Google’s recent PR push for more transparency. I am also curious as to how widespread this will be, as I think it is silly to provide negative reinforcement for people engaging in “White-Hat” SEO techniques, and it just seems overly paranoid about having their results gamed.
” For instance, if someone stuffs a page full of keywords, instead of the page improving in rankings for certain queries, it might instead drop in rankings. If the person responsible for the page then comes along and removes those extra keywords, it’s an indication that some kind of rank modifying spamming was going on.”
I think that is absolutely ridiculous. What about the person who made changes to their website to target what they thought were more relevant keywords etc…and then they see a resulting drop in rankings, and not knowing why, they roll back the changes they made, and therefore send additional signals that they are a “spammer”.
Agreed, spammers should be penalized. But I bet there wouldn’t be so many spammers if Google made it easier for bloggers to get the word out about their blog. People work really hard on their blog, and all they need is some notice. So, they resort to the easy way out, which is spamming for links and whatnot.
Great Article, though!
Google’s been doing re-ranking of the initial result set for a long time based on all sorts of things, including the number of sites linking to any of the sites in the initial result set and user-behavior-based re-ranking. The idea that they throw in random variation or time-based re-ranking of suspected spammer sites or algo-confirmed sites to throw folks off their trail seems like a pretty interesting way to either confirm or weaken their spammer site assumptions… As I read the patent and chatter, I can’t help but think about how many sites may have fallen into the trap and scrambled to ‘fix’ things when they received the unnatural links email warning last month. Google did say something like some sites were safe to ‘ignore’ the message. Maybe they were fishing to see what changes ensued as a result of the email, and by making quick changes spammers and shades of gray were quickly revealed to Google. Did your SEO firm panic and make changes just in case it was this or that? How do you feel knowing that Google may well have simply gone fishing to see what you would change?
Going through the comments, it seems like you guys have missed a key sentence from the article: “If those are determined to be spam, Google might investigate further, ignore them, or degrade them in rankings.”
This means this patent is a way of figuring out which webmasters are “hip to the game” and a way of keeping tabs on them. If they are doing something Google doesn’t like, their rankings will drop; if they are doing something Google does like, their rankings may drop but will eventually return and maybe go higher.
Thanks for the post, Bill!
Certainly, it will stop spammers for a while, but I think that spammers will find a way.
Personally, I’d rather they not find a way. I would also like Google to not push us so often toward buying AdWords ;)
Regards
It is getting closer and closer to results being ‘random’ at any given point in time for newer, less established, websites without a significant ‘trust’ factor.
This is no surprise to me as I have long thought that Google’s ultimate aim would be to have as close to ‘random’ results whilst not appearing to do so. In this way more and more website owners would need to resort to Adwords – with the result of pushing the price for any given search term as close to its marginal value as possible.
The trick for Google is of course to maintain an ‘air of honesty’ whilst doing this, otherwise if people think they are being fed random (less relevant) sites, then they will be more likely to move to a different search engine.
The trick for us SEOs therefore also becomes one of trying to build the required ‘trust’ whilst not being obvious about it.
On a side note: it also seems like the ‘author’ aspect being used by many (especially around G+) to help with rankings could be a double edged sword. If your ‘author’ once gets flagged as a potential spammer, guess what, Google will know everything you’ve worked on (because you tell them) and can put that page/site into the spotlight too.
Great post Bill! It took me a while to fully understand this patent. The main purpose of this patent is to catch spammers who manipulate SERPs deliberately. It sounds ideal and could well be implemented in a recent Google algo update.
What’s really interesting is the addition of a new dimension (time) into the ranking. No longer are rankings exclusively a factor of the state of the internet now, but also of how websites have been changing over time. Two pages/sites can have equal quality content, linking profiles, etc – but very different rankings because of their history (and apparently the way they’ve reacted to changes).
Nice info… And I hate spammers.
Who says that SEO takes immediate effect? Patience is one of an SEO’s virtues. I would never expect to see reactions just one or two days after doing certain changes. Those who are on edge and respond to unexpected effects with hectic counter actions are the ones who will be punished by Google.
Additionally, as an SEO you often have the problem of correlating actions taken with effects on ranking etc.
I can see how this may have potentially been in play for a while actually. Thanks for sharing this. I have observed some ranking behavior similar to what is described here; in particular this seems to occur with analysis of how social media impacts SEO. Social media signals are the most gameable signals to date, but they carry merit, so it only makes sense that in order for Google to properly apply such signals this type of algo is needed. I believe that the document uses the words “keyword stuffing” and “invisible text” etc. to allude to social-based rank-modifying techniques while not giving any of the good ones away. Way to go Google. I am not concerned that this patent applies to all content improvements. If you add an H1 tag and some alt attributes for W3C compliance, I don’t expect Google would be foolish enough to apply this algo here. Rather I believe the engineers at Google will selectively apply this type of rank adjustment in cases where SPAMMERS are finding loopholes. This does however provide some merit to thinking about a gradual approach, much akin to the analogy of “spoon feeding” Google, when it comes to improving your SEO. Thanks again for a great post! W.
Very in-depth read!
Google are really cracking down…
This has been suspected for many years now, although it’s great to actually see it in black and white. I remember people warning against continuously changing Title tags in 2004!
Why didn’t I see your blog before 🙁 One of my sites already got banned from Google! I wish I could reverse time, man :((
I had committed a big mistake by adding invisible text and adding over 300-400 meta keyword tags..
“They’re smarter than they look.” LOL JON – they definitely are 🙂
Actually it’s like this: Google employs so many brilliant specialists that we as SEOs can be assured that these guys probably know EVERY possible way of manipulating search results – the only thing that has prevented Google from kicking more spammers’ asses so far is that teaching the algorithm to recognize spamming precisely, without bringing up too many false positives, is a big challenge – even for Google. But the Google guys are clever and try to play their cards right – showing a mean pokerface while provoking SEOs to make mistakes – like they did by sending out the webmaster tools notifications, for example. I guess they never exposed so many spamming activities in a comparable timeframe.
This new patent (if it becomes an update one day) will absolutely raise the chances of spam detection, but – like every other measure Google takes – it can be compensated for if you know what’s going on behind the curtain. But if SEOs do not panic and stick to best (post-Panda/post-Penguin) practices, this shouldn’t become a big deal – time-based metrics have existed for years now. But like Jonathan from Splice Marketing said in one of the first comments, communicating certain movements in rankings to clients could bring up some problems – especially when dealing with impatient and unteachable customers.
Hi Bill,
From Google’s point of view this sounds brilliant. Essentially it appears that they are trying to determine who makes particular changes that raise flags, based on the changes that are being made, and at the same time randomize reverse engineering of those changes so that business owners cannot effectively game the process. By smoke-screening the process to the owner more, they might be less inclined to make specific changes that would be viewed as potentially spammy, since they can no longer predict a particular result set for those changes.
However, my concern would be false positives. Owners do choose to change their website titles. They choose to change H1 tags, add keywords to text.
If a business owner was to do this, and see no effect (but be considered in the ‘staging’ or ‘wait’ zone for monitoring from Google) then one could quickly spiral down in terms of knowing what is and isn’t SEO or a feasible approach.
I would suggest it all comes down to Google being more clear about what SEO is and is not.
It would be fantastic if Google would consider sending a basic comprehensive guide to business owners via webmaster tools so that there is more transparency in the process.
Interesting. I am happy with changes that help to minimize spam, but like everyone else I am worried about how this affects white hat SEO.
I always thought there could be a bit of randomness after a change, possibly from the page being set aside for analysis. E.g. change something on a page, Google notices it so puts some ‘pending’ flag on it for further analysis, during which time the rankings could fluctuate until the full analysis happens and it stabilises in rankings again. A by-product of all the different parts of the algorithm kicking in over time; I never would have guessed that it was intentional!
Interesting to see this as a patent; however, the amount of surprise at this surprises me. I’ve had conversations going back years with SEOs who suspect that something quite similar to this happens. Likewise the concept of “don’t flinch – push on through the drop” seems to be a common principle in some of the low quality/high volume schemes that are around.
Leaves me wondering whether this is totally new, or building upon something that was already in place.
Google must fight spammers in one way or another; this is very logical. What I do not understand is why they are not doing it right, as they should. In my opinion Google has just raised the amount of spam after the Penguin update, not reduced it as they say, and because of this I am afraid that the same will happen in the future with other Google updates!
Hello Bill Slawski, I completely agree with you because I have seen the same result on my website… but I couldn’t identify the pattern of what is happening…
After reading this post I am very worried… I have seen the same kind of lost rankings in Google’s global search results. My website was ranking in 2nd position on google.com for “web design company” and now it’s at number 8… I don’t know why this is happening.
Do you think repetitive tasks or participating in a particular place can cause the problem?
Finally there is an official explanation for the Google dance. But making this information publicly available doesn’t seem like a very good strategy. A large portion of spammers will find out and adjust their strategy…
Hi Bill,
I don’t know about your side of the pond, but over here in France, your post is bringing up lots of debate.
What puzzles me is that the symptoms suggested are not new. We have always suggested not to touch anything (besides server problems or hacking) when a website is in trouble with Google. Especially when you know you’re guilty, do not go and remove right away whatever was meant to cheat search engines. This has been true since I started SEO in 2003.
The question is why the patent is coming up now. The timing seems very much in phase with Google’s willingness to increase SEO paranoia.
A victory for all the companies who didn’t care about their website and rely on AdWords.
Wow I’ve been seeing this happen for a few weeks now. Mainly with adding more content to a skinny page that is #8, hoping that it will move up. Instead it seems to fall a bit for a few weeks, then move back up from where it previously was. It’s been a repetitive pattern that I see over and over again.
I think most spammers are already aware of this issue as it has been happening for quite some time already. There are only so many tricks Google can do to try and fool people, but what it all comes down to is that the system will always be gamed, like it or not. It just makes the barrier to entry greater for those who aren’t up to date on all the latest changes.
This results in me explaining to my clients: I promise I optimized your pages correctly, sometimes Google just likes to mess with you to make you think I ruined your rankings. In a few weeks, the rankings will improve. In the mean time, you still have to pay me though… lol.
Sounds like they’re two steps away from having AI gauge ranking signals.
The problem here is that even white hat SEO tactics can be labeled as spam in some cases. Six months ago we could create a free widget with a link to our website. Now this can drown your website… Nothing is clear, the game rules are changing every month. I understand Google wants to deliver the best results, but now the results I see for lots of commercial keywords are bigger websites; small businesses that work with local SEOs are reduced to Google Places and vanishing from Google’s natural results. I hope these formulas will mainly help the user to get what they search for. But why is a small business not a relevant result? Because it doesn’t have links or enough social signals? It’s not fair.
So Google needs to file for a patent on a system that is destroying the way the search results have been working for a good number of years, since they introduced the SEO method of ranking in the beginning. Now they want to destroy this and just make money from people via AdWords. The only problem is we only go to Google in the first place because they provide good results. This ridiculous notion that people doing SEO are spammers has to stop. But you know the craziest part of this patent filing is that Google somehow thinks this gives them an advantage? lol, and they need to protect such a system with a patent. When Google finally crumbles some years from now and everyone shifts to the next search engine, I am sure it is going to be one based on the early model of Google that we all loved when it first turned up. Just annoy enough people, esp. the key influencers, and tweets will lead the lemmings over the cliff and away from what was once a great search engine. Now it is rapidly becoming a jumbled advertisement and a Wikipedia platform. What a useless patent: the very thing that will kill them, and they think the next group would want to commit the same suicide?
Bill as usual this was a great read. With all that is going on in the world of SEO including Panda and Penguin updates I find it is harder and harder to figure out what works and what doesn’t as far as optimization goes. I have noticed exactly what you are saying as I have tried to figure out how I should be looking at SEO moving forward. I have seen a decline and then a bounce back.
As an SEO, I have always seen a delay time in seeing any document settle into Google’s index. I don’t expect this loop-de-loop card shuffling patent to hit all websites or new html docs submitted to the index. There are other flags that must be raised first before the scrutiny sting goes into motion. The problem with AI is that it is artificial intelligence. Google has expanded its human manual review (I do believe) & I would expect a method for requesting a human review if your site gets sandboxed. Best practices for optimizing is to make small changes over a period of time. Sometimes big changes are unavoidable when you are changing nav structure or site rebuild. SEO will adapt by being more deliberate in implementing changes to documents, and 3-4 weeks is always needed to show upward trend beginning in any case. This will impact webmasters who don’t have a firm grasp of best practices, but they have always been impacted. It will also cull out SEO firms who use black/grey hat methods. That’s good news for the SEO professional community. There are enough SEO companies who have harmed the reputation of the industry, that many burned small business owners in my area view SEO as being in the same league as spam. They hire me as a “webmaster” to fix the problems. Google is doing a great service with this patent.
Google will continue to implement newer and newer techniques in order to put better content in front of searchers. And this is the reason they have such a huge impact as a search engine.
That is why people doing SEO should be smarter to prevent their sites from being banned.
Very interesting post. I am sure that one of the main objectives of Google is to protect their algorithm against reverse engineering, and this and other techniques must have been used over the years to confuse the world.
They are clever, and they know that they cannot underestimate our little brains working together (cloud computing).
For years thousands of people have been analyzing the Google algorithm and we are very close. They know it, and they will try everything to stay one (or more) steps forward…
For months already I have seen this kind of behavior in Google. Here and there when link building I see the pages drop first, and after a while they increase in ranking again to the positions I expected them to jump to.
I think there won’t be any trouble if you keep on link building at the same pace after a drop in ranking. Just keep a cool head and don’t react too heavily to individual rankings.
It’s funny though to see that Google still needs to fight hidden text and stuff.
This proves that SEO is dead and those who rely on this for a living need to step back and find alternative revenue.
Google is so determined to banish any sort of manipulation that it’s using psychological warfare.
Even issuing a patent for a new algorithm could be a fake to throw the SEO community off the scent.
I found that sites which had no SEO and simply generate content that’s worthwhile rank far better than any that have had SEO. It seems that leaving sites alone is the best course of action?
E
It will be interesting to see how well this patent can differentiate between true spammers and legitimate SEOs who are doing what Google has said to do all along: provide good quality, fresh content.
One of my clients is in a fairly niche market, and the sites dominating the search results (for competitive terms) are clearly violating Google’s ToS (mainly keyword stuffing and very spammy backlinks). Yet they remain dominant in rankings. It’s very disappointing to see that after three major updates, Google still doesn’t punish such sites enough.
The interesting X factor in all of this, is that if Google is in fact using this technique (and it’s fairly brilliant, so I don’t know why they wouldn’t be) what timescale does the “transition” happen over? Depending on the length involved, it might make it downright impossible not only to conduct “rank-modifying spamming” but other behavior Google might want to discourage, such as SEO experiments to identify and publicize ranking factors, or a well-resourced competitor reverse-engineering the algorithm.
But before we all get our collective panties in a bunch about Google calling us all spammers and that the lesson is “no one ever do anything to their sites that would increase their rankings”, consider that whatever time a search result is in a “transition ranking” is time where Google is in effect overriding their own search algorithm, and therefore not returning optimally relevant results. I don’t think they would want to spend a lot of time in that state, given that the relevance of their results is still the backbone of their revenue model.
Once again, I think the lesson here is that you have to be in SEO for the long game.
That is real clever from Google. Actually changes in the Google ranking algorithm have in fact been positive for bloggers like me, who do not know how to do spamming or similar tactics. I used to regret not knowing this all spamming stuff, but I am pretty happy now that I never knew properly how to do all this stuff. My ignorance proved to be a blessing in disguise. Thanks Google:-)
In the wake of all the changes Google has been making, this patent gives a little insight on how to combat those changes. Simply use only white hat methods. I am curious: do web spammers still use cloaking, keyword spamming and other tactics? Or are most of the spamming methods used today mostly link manipulation? Does anyone have examples?
Great job, Google, keep bringing the witch hunt and screwing the innocent website owners over as you do it! I look forward to outsmarting you and your silly a55 algorithm changes to get my non-spammy sites ranked as I have consistently and without fail for the past 8 years. Just another round of cat and mouse that you will lose. /Yawn
I think that you are slightly off-target on part of your analysis. Part of the patent text explains that Google is going to watch for other changes to ranking factors during the transition time, and you are assuming that those changes will be in the form of further actions taken by SEOs. This is paranoia.
I think that Google is using this method to see what happens to other ranking factors (like those related to traffic such as bounce rate, page views, referral traffic, etc.) when a rank is transitioned. Google wants to see if the website in question relies on SEO for all of its traffic. For example, if bestbuy.com were to randomly drop to position 45 for some keyword, Best Buy would still have lots of traffic because it is a legitimate site that people will return to and that other parts of the internet will refer people to. If this happens to a spammy affiliate site, Google will see that their entire traffic economy relies on rankings, and it will choose a third rank for the site much lower than the previous one.
It would be very small-time for Google to invest in a patent to confuse SEOs. I think we can safely assume they are looking at real data and the bigger picture and ease up on the feeling that Google is out to get us.
@ Laura,
No assumptions on my part. I recommend that you read through the patent if you haven’t already. My analysis says that this could apply to any changes at all that might result in a positive change in the rankings of a page. The patent tells us:
So, if changes have taken place on a page that can cause that page to rank higher, Google may monitor the page for additional changes during the transition ranking function stage to see if there are positive signs of spamming taking place. I didn’t distinguish in my post whether those were the actions of a spammer, a webmaster, or an SEO.
The purpose of this patent is not to confuse SEOs, but rather to confuse anyone who might be attempting to spam their search index. The patent is pretty clear about that:
Thanks.
@Bill
I thoroughly read your article and the patent, and I think we are on different pages. I meant that I don’t think Google’s main purpose of this part of the algo is to watch people (whether they are SEOs, spammers, webmasters, or all three). I think it is to collect other data that will manifest itself even if the people disappeared.
For example, Google could learn that the page transitioned to spot 90 loses 100% of its traffic, which would mean the site has no direct traffic and no referral traffic, and that could be a signal to Google of low quality content. There are lots of signals like this that would change as rank was experimented with, and they would change even if the SEO/spammer/webmaster was on vacation that month.
I hope that clarifies.
Hi Laura,
I really don’t understand your interpretation of the patent or my post. The patent is clearly about how Google might respond when there are changes to a page that would result in that page increasing in rankings for particular queries or terms, and has nothing to do with Google monitoring losses of rankings or traffic to determine that a page might be low quality.
There are several events here. The first event is a change presumably initiated by humans that results in a change to ranking factors. Maybe your company gets lots of new backlinks in a short timeframe. This is when Google’s rank transition patent part of the algo kicks in to put your website to the test.
Instead of just giving you a higher rank for having more backlinks, they mess with your rank a bit, both increasing it and decreasing it, and sit back to watch. The results of this experiment are the second set of events, described in the patent as further changes to ranking factors. Sure, Google COULD be doing this to see if your SEO firm flips out and abandons the link farm (or joins another) but it makes much more sense to me that they are monitoring something else.
I am proposing that the second set of events are not a human reaction to the transition phase. Here are some example scenarios that reflect my point:
A site that still gets traffic even without Google traffic is likely to be a more helpful site. When Google lowers a page’s rank as an experiment, they learn more about their traffic patterns.
Similarly, if a site from rank 20 is experimentally shoved into spot 2 and people aren’t bouncing out at a very high rate, but are stopping to read articles or buy products, it could mean to Google that the site deserves the higher rank. When Google gives a page a higher ranking, they learn more about how people use the website.
If these sort of factors are considered “ranking factors” then ranking factors would change during the transition time without any tactical changes by humans.
Hi Laura,
While that might be something that Google might do, it’s not something that the patent is saying they are doing. You’re adding to what is covered in the patent with your own assumptions. Not a flaw in my analysis, but rather some assumptions on your part.
Fair enough, Bill. Did not mean to offend. I just think it is worth thinking about.
It is time to say bye bye to poor links; we must follow the recent Google quality guidelines according to the Google Penguin and Panda updates. Keyword stuffing and anchor text links have become odd and we can’t gain anything by doing this. Also we need to forget “PR” and dofollow; every link is important if it is coming from an authorized site.
It’s definitely worth thinking about. It’s very likely that if a page’s rankings drop, there’s a good chance that traffic will drop as well. Can that be used as an indication of the quality of the page, though? I don’t know that it can.
The patent does tell us that it might randomize the rankings of pages, and that might include moving a page up much higher in results, such as the #2 spot you mention. Google could look to see if the page does get a corresponding bump in traffic when it’s listed that high. Most patent descriptions don’t provide a full roadmap of the processes that would be used under them, and in this case we really aren’t given details on how Google would assess changes to pages after a transition rank function went into effect. But it does seem like Google is waiting for the person in control of a site to respond by making changes, rather than assessing the impact of changed ranks alone. Google wouldn’t need a transition ranking function to do the kind of test you describe.
Hi Bill,
As always I enjoy your posts.
The language used in the patent is a bit vague by GOOG.
I see spam working in the SERPs, but reading this article may scare folks away from engaging in SEO.
I wish explicit examples and definitions of spam were also mentioned in the patent.
Thanks+
Hmmm. So I could hose your site with links, wait for the ‘transition’, then turn the hose back on. Won’t this tank your site? The G won’t be able to tell the difference (still in the dark).
Btw, white hat seo is just a shade of black – you’re still gaming the system. Don’t kid yourselves that the god of the SERPS will look down favorably upon your adherence to the written word of the webmaster’s guild – G is one tempestuous bitch.
Wow, Bill. This is a real eye opener. It seems like everything eventually comes full circle with Google. Hopefully there’s no one left out there who thinks you can build a stable business using Google’s organic traffic only. It is so obvious that everyone would be better served getting traffic and customers from every where other than Google. If you want traffic from Google just keep on producing new relevant content and focusing on social sharing, it seems.
Great job on your analysis as always.
Darren
It’s funny to me that Google has really ruined their search as much as they have. I’ve noticed in some niches the same site has 20 of the top 40 spots for keywords.
I love how they want to penalize people for dupe content, when a large part of their business model is scraping and indexing.
Nice work and effort. Out of all of the elements mentioned, I find link-based manipulation to be the most interesting; everything else is pretty obsolete, and most of the black hat community is focusing on paid links and the sort, which is more complex in nature versus the other items like small text and keyword stuffing. Since links are a focus of this patent, maybe it will be associated with a Penguin algo update?
We have seen page positions moving randomly, then going back to the initial position, for years. I’ve never been obsessed with positions, so every time a page I follow gets a lower rank, I tend to do nothing if I’m confident the page has enough quality.
The Google strategy is quite smart to detect private networks, but this is now known by spammers – so what are they using now?
The most damning conclusion of this patent is that Google is relying on a direct and unswerving correlation between ranking changes and “spam”.
That sounds like highly suspect logic, liable to include a load of false positives, and seems to be solving the problem the wrong way round.
Read about this first on Aaron’s blog and linked to this. I added a comment there and will say the same thing here:
I once tried making my own web-based rank checking tool back in 2007; it was more of a scraper, not using any APIs, and was made in PHP. I didn’t even use cURL (http://php.net/manual/en/book.curl.php); I simply used the PHP function file_get_contents, and it was running well while testing and programming and was done by the next day.
After checking the script the next day, it was still working but the results seemed incorrect. Then when I waited more than a week to a month, I noticed my results were not changing at all. And looked way different from the SERPs.
Now one thing to note is…
1) My script had no caching system in place.
2) The results were the same even in different views on different ISPs on different days so it is not something cached on my ISP.
3) The function really pulls the source code of a Google SERP page, and if a connection is not successfully established, the result will be blank.
4) The results seemed to have been purposely customized to my script alone so that the results would never change. But instead of blocking my script, Google purposely kept pushing a page to my script but with false rankings.
Ever since then I said, I am not going to make my own rank checking tools, so much to do… from modifying user agent, rotating IP addresses, adding in pauses to emulate human behaviors, etc.
I looked at the patent at the link above with the 2012 grant date, but it was filed back in 2005, and I think I had already seen this in action when I was trying to make my own tool in 2007.
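For reference, the heart of that old script was not much more than the sketch below – reconstructed from memory, so the query URL, the result regex, and the example keyword and domain are only approximations of what I actually had:

```php
<?php
// Rough reconstruction of the 2007 rank-checking scraper (from memory).
// The query URL and the regex for result links are approximations;
// Google's markup has changed many times since then.

$keyword = 'example keyword';
$target  = 'example.com';

// Pull the raw HTML of the results page -- no cURL, no API, just file_get_contents().
$html = file_get_contents('http://www.google.com/search?q=' . urlencode($keyword) . '&num=100');

if ($html === false) {
    die("No connection established -- the result is blank, as noted above.\n");
}

// Grab the href of each organic result (very rough pattern).
preg_match_all('/<h3[^>]*>\s*<a\s+href="([^"]+)"/i', $html, $matches);

// Report the first position at which the target domain appears.
foreach ($matches[1] as $position => $url) {
    if (strpos($url, $target) !== false) {
        echo $target . ' found at position ' . ($position + 1) . "\n";
        exit;
    }
}
echo $target . " not found in the top 100 results\n";
```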
The patent does tell us that it might randomize the rankings of pages, and that might include moving a page up much higher in results, such as the #2 spot you mention, or the situation described in an earlier comment where the same site holds 20 of the top 40 spots for a niche’s keywords.
“This patent really makes sense and I feel that they have been doing this for years. When looking at historical ranking reports for keywords we are targeting we almost always see dips before rises, but the trend is always upwards.”
Seconded, Steve! There is no doubt it’s like looking at a lake in the Midwest during the fall season; the churn makes clarity nonexistent.
Without hard data backing this up, I firmly believe this is happening heavily in positions 2–50 right now. This is just my takeaway from monitoring 10,000+ keywords on a daily basis.
A lot of over-analysis going on here. There’s a simple reason they do this, imo: it drives more reliance on PPC ads, since any commerce company now has less confidence in consistent rankings and less transparency into the effectiveness of its marketing strategies in real time. PPC positions fluctuate much less than organic rankings do – except for brand-related queries, of course!
Smart filter indeed. The “quarantine” might feel a bit unfair for genuine, non-artificial growth. I guess the solution is: don’t panic after fluctuations, and hold onto your strategy regardless of the short-term effects. “Keep Calm and Optimize” still applies.
This is a really interesting concept, but I can’t help but agree with the others who have voiced serious concerns. Often clients expect to see a change for the better or, at worst, no change. If there is a possibility of a site dropping in the results, then it does pose a problem for white hat SEO companies, or even for people who are legitimately trying to improve their own site.
I for one know that I have knee-jerked in the past and removed changes. I tend not to do that so much now, but let’s face it, it’s a natural reaction.
I realise that Google has to combat spam, but I’d be more concerned about the state of the local search results at the moment, which consist largely of directories such as Yell, Qype, etc. Not what I would call the most relevant results.
I could’ve sworn I commented on this before…
Anyway, it sounds like Google’s trickery could actually work in the case of spammers using black hat SEO methods to increase their pages’ rankings: they see a negative change, revert the page, and Google watches them and ‘confirms’ that the page is spam. But how does reverting a change to a page prove that the page is spam or low quality?
Couldn’t this also detrimentally affect webmasters who are just trying to get a grasp of SEO, or who are just trying to get their pages to rank well (without any intent to spam or use black hat methods)? I can see it scaring folks away from optimizing “too much.”
I know of several sites that have been penalised by Google as spammy when in fact they are high-quality sites with good content posted on a regular basis – not black hat techniques at all. One particular site had 40,000 spammy links from some company in Brazil which were never removed, even after repeated polite requests. Consequently the site was penalised and lost considerable PageRank. Pretty unfair!
It took Google five years to get the directories under control; only last year did they catch up with the worst of the worst article directories and blog networks… I’m not too worried; just stay away from cheap crap and you’ll be fine.
So does this mean that the best way to move forward would be to start with a solid plan and not make any changes to your methods afterwards? Like “that’s my story and I’m sticking to it”… Maybe this is an indication that Google may not have the means to truly filter spam from the Internet, so they are taking the easier route of identifying and targeting the actual spammers and their web properties.
Man, this was a great article. My brain began to melt when I was reading the patent. Great explanation. I’m giving you a big plug in today’s video blog. 🙂 I wanna be an “SEO by the sea” – it conjures up wonderful images.
Hi Bill, a question about implementation of the patent: has this rank transition process already been in use, or should we expect to see it only now that the patent has been granted?
Wow Bill – this is the first blog post that explained the Google approach in so much detail. Thank you for that! Although I have been blogging for a while, I still consider myself technologically challenged in many ways. I had not heard the term “invisible text” before, so I don’t know whether I use it in any of my blogs or not. I am hoping to come to the next Piedmont Bloggers Meet-Up meeting so I can meet you in person and start exchanging ideas with other local bloggers. Sorry to say my doctor recently told me not to drive anywhere, so I am in the awkward position of trying to get people to drive me places when I need or want to go somewhere. Living out in a rural area makes it a challenge! Anyway, I appreciate your blog very much, and will be recommending it to other bloggers I know as well. It seems to me that you have a lot of information to share with us “aspiring” bloggers!
Glad I didn’t waste my time with article submission sites!
Google is on the offensive and I’m sure everyone has seen something interesting happen on their site.
The page redirect information was interesting. I wonder how big sites with thousands and thousands of redirects will fare?
My biggest issue is that it now seems you can just spam your rival with links and get them taken out, which is so unfair. I’m all for quality content and building authority sites. What are the best ways to protect a site going forward with so many changes, and is it worth the effort?!
This is potential brilliance!
The way I see it, they look for pages that suddenly surge in the rankings. This surge might be due to any past, present or future page rank manipulation technique.
Of course there could be a legitimate reason for a page surging in the rankings. But I’m sure they can cross-reference it with quality, or use the “block this site” data, or just do a manual review. Of course they also have reams of historical data to play with.
As far as negative SEO goes, this wouldn’t work. More links would make a page temporarily rise in the rankings, but then it should go back to where it was before.
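Purely to illustrate how I picture that time-based response – a toy sketch of my own, not anything taken from the patent’s actual math – it could look something like this:

```php
<?php
// Toy model of a time-based "rank transition": instead of jumping straight
// from the old rank to the target rank, the engine returns a transition rank
// that can move the wrong way or wander before settling, hiding cause and effect.

function transitionRank(int $oldRank, int $targetRank, float $progress): int
{
    // $progress runs from 0.0 (change just detected) to 1.0 (transition over).
    if ($progress < 0.3) {
        // Early on: respond in the opposite direction of the expected change.
        $inverse = $oldRank + ($oldRank - $targetRank) * 0.5;
        return max(1, (int) round($inverse));
    }
    if ($progress < 0.7) {
        // Middle of the transition: add noise so the page appears to move randomly.
        return max(1, $oldRank + random_int(-5, 5));
    }
    // Finally settle toward the target rank.
    return max(1, (int) round($oldRank + ($targetRank - $oldRank) * $progress));
}

// Example: a page sitting at rank 40 whose changes would "deserve" rank 5.
foreach ([0.1, 0.5, 0.9, 1.0] as $p) {
    echo "progress $p => rank " . transitionRank(40, 5, $p) . "\n";
}
```

In a setup like this, a spammer who reverts their changes the moment rankings dip is reacting to noise, and that reaction is exactly the signal that gives them away.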
This is just a genius response from Google and a real spam killer. I can also confirm, from different forums I frequently visit, that people have seen sudden increases in rankings after doing some link building, only for the rankings to go back after a short period of time. I think Google is very close to rendering most, if not all, unethical link-building techniques useless.
Very good and interesting read which confirms what I’ve been seeing.
Thanks for this post. I think this will drive more business to smart SEO companies, because the rankings will appear more random and black-box-ish. The everyday webmaster won’t know what is going on, pushing companies toward some form of consulting. The smart SEO companies will benefit because they will be able to clearly describe the process and the short-term outcomes of the work to be done. The SEO consultant has a list of good white-hat techniques that have been proven to work, and will slowly implement against that checklist while the SERPs dance, knowing that in the long run the client will be happy with the work.
These patents are important, but I think Google already changed the whole SEO industry with the latest EMD algorithm update. They are fighting hard against spammers, but I still see people who have found a way to increase their ranking signals with low-quality content backed by a huge backlink profile with PageRank.
I have a website that ranked on the first SERP last week for many keywords. Although it is still part of the G-gle index AND I received no messages in WMT, the website doesn’t rank for ANY of the keywords any more. It has dropped out of the rankings entirely – nothing at all – and I can’t believe it. How can G-gle do this to businesses and people? Really bad!
Is this possible guys?
When you lie and cheat, you know that you’re lying and cheating, so you know it will hit you sooner or later.
Sure enough, as a genuine person you need to be sure to stay out of the line of fire, but honest people getting hit by Google by mistake is rather rare – let’s be honest about that. I can say a lot of bad things about Google, but they do get that the knife has two sides; they stay within the lines pretty well; I’m not worried.
It’s really nice to see such changes in Google…
Earlier, people used to do lots of tricks like building backlinks through spamming, automated tools, etc., but now it’s not that easy for spammers to survive… yeah, Google is working like a military – finding and killing 😀
The only thing is… it also punishes innocent people who are new to this challenge, as they are not aware of Google’s policies.
Nice work Bill.
Cheers,
MJ
Hi Michael,
That’s why Google publishes their webmaster guidelines, provides webmaster tools, has lots of help and support pages, makes videos answering webmaster questions, runs webmaster help forums, and publishes blog posts about what they are doing.
I think this is idiocy! Someone is thinking way too much. G apparently does not look beyond the end of their noses or think outside the box. I can think of dozens of cases which can throw this algorithm on its butt without any mal-intent at all. For instance, the home page of a small (un-respected, by G) hometown newspaper changes at least daily based ‘randomly’ upon the daily news – not on any attempt to game the system; the topics come, build, and then go away. So by chance, one day it mentions the town name once, the next day four times, then the next day twice, by necessity, without regard to rankings. That could coincidentally correspond to their gaming algo. Entertainment magazines (who is suddenly in the limelight this week, gone next week), and on and on. At the very least they would need to incorporate recent local trends to see if the topic is trending.
Or constantly changing pages of sports scores, racing reports, weather reports, TV listings, stock prices, and stock-pick newsletter recommendations. Sometimes coincidences DO happen. Heck, what about plain old blog pages with accumulating comments, which can change between crawls and trend toward a central or deviant topic? Sounds like a perfect competitor attack. Does G pretend they can identify every custom home-programmed blog application they aren’t familiar with as well? They are obviously thinking one-dimensionally and straining to force every site to conform to THEIR idea of the ideal site (so it is easier to incorporate into their new “INFORMATION ENGINE”), squash originality, and probably someday start their own line of expensive webpage-generation software with pre-set templates required in order to rank decently. At that point we finally become unpaid information-mining slaves.
It seems clear to me that instead of trying to confuse and mislead webmasters, Google needs to be clearer about the do’s and don’ts in their Webmaster Guidelines. With regard to negative SEO, a senior Google representative is quoted as saying that you should “immediately change your website’s URLs for the pages being attacked.” Where does that leave us with Google’s rank-modifying patent and the Penguin update that attacks your website as a result of those spam backlinks? Frankly, Google are making themselves look ridiculous and unprofessional. They are losing the respect of webmasters across the Internet fast, and they cannot redeem their actions. It’s all about greed for money and advertising, with very little thought for webmasters who may have spent six months of their time or more developing a beautiful and professional website according to Google’s Webmaster Guidelines. Creating this kind of bad feeling will only create more problems for Google, who now make Bing and Yahoo look like a safer option for designers. So now you can develop a website but never make any changes to it, apart from adding new content. Frankly, the most poorly designed websites are now at the top of Google’s pay-per-click results. We need to pay our bills too; Google are making webmasters look ridiculous in front of their clients. Finally, not everyone can design a professional website! Thanks for wasting our time, Google – you will see our websites on Bing!
I am certain millions of webmasters now share my sentiment!
Hello SEO people. Congratulations to Ross Koningstein for his new patent.
Just my two cents worth: I wish your product was not able to be so easily gamed.
Google aren’t quite there yet. An example is a payday loan company that has seen its rankings negatively impacted by negative SEO from illegal competitor sites, while those competitors’ own sites are clearly poor quality and spammy but still rank highly.
As always, another vague, ambiguous, and unintelligible rule that will create an environment of arbitrary and capricious loss of rank and scare honest individuals away from making improvements to their pages, such as adding things that will improve user experience: 1. breadcrumbs, 2. semantically related and improved content, 3. removing an overabundance of spammy exact-match anchors, disavowing links from negative SEO attacks (rising daily now), etc. Of course, it is bound (in the mind of Google) to increase PPC. But I know of few people who can afford $80 per click. So we shall see how this pans out soon enough. As a Goog stockholder, I think it hurts the company any time people rush to Bing, such as when a Wikipedia result beats a better result more relevant to the query.
Hi Michael,
If you haven’t, I would recommend reading the patent. Chances are extremely good that changes that improve the quality of web pages – such as adding breadcrumb navigation, improving titles, meta descriptions, and headings, adding fresh content, removing over-optimized links, and so on – will not go through a process like this.
I also sincerely doubt that the purpose of a patent like this is to increase ad spend on Google. It’s to stop people from manipulating search results, and I think it’s a very good idea.
Bill: I appreciate and respect your response. So much so, that I actually included it as part of an article I wrote on this very subject. As an attorney, I am trained to look at the underpinnings of motives, and actions. So yes, I have read the vague, ambiguous and unintelligible patent several times, and this article over at the Circle of Legal Trust is the culmination of my theories:
http://circleoflegaltrust.com/new-google-patent-lawyer/
Hi Michael,
Chances are really good that Google has been following this practice since at least 2004, since the patent I wrote about is a continuation patent for a patent originally filed in 2004. What was interesting to me about it wasn’t that it disclosed anything new. If you have been practicing SEO instead of law from that time to the present, it’s possible that you might have seen Google respond in a manner like that at some point, or have seen one or more people who wrote about situations like that in forums or elsewhere on the Web.
I’ve recommended and/or made thousands of changes to websites in ways that improve the quality of those pages without seeing either the negative or random results described in the patent in question. But then again, I haven’t been scraping content from other sites, stuffing keywords into page elements, hiding text on pages via CSS, JavaScript, or really tiny text, using cloaking or JavaScript redirects to show searchers and search engines different content, buying links from anyone (including private blog networks), or engaging in similar tactics that might put a business in a situation that could put its continued existence at risk.
Hi Bill. Actually, I have never really practiced SEO. But I do practice law 7 days a week, and part of our practice is copyright law, unfair business practices, and class actions, so these all potentially cross over into SEO and the practices of Google, Bing, and Yahoo! One thing I also practice daily, however, is brand building. Being a Marine, I look out for fellow lawyers – my comrades – who have gotten screwed over by so-called “SEO experts” who charge thousands of dollars for so-called “site audits” so they can tell you what you already know, like that your site needs a better “call to action,” etc. Like any great general, one must learn the complexities of the battlefield in order to choose the most qualified field commanders – or in this case, the right person to help with their sites. So far, you, AJ Kohn, and a few others are the only people I would recommend. In any event, I can instantly identify a charlatan due to my advanced knowledge about SEO, and the purpose of the Circle of Legal Trust is to help other lawyers identify the same thing. Most so-called SEO people assume attorneys are a bottomless piggy bank, speak in parables, and act as though they know some secret no one else knows. But I do appreciate your comment, and I DO see what you are driving at. Please don’t assume that I spend my time “practicing SEO.” I practice what I call “Defensive Branding,” and it is a corollary to my law practice. If the phone does not ring, I have no law to practice.
Makes it tough to implement changes… and then correctly assess the impact of said changes. Ack!
Interesting post about Google’s continuation patent. I agree with the practice of developing quality pages, but it has been tempting at times to use less reputable methods like keyword stuffing in a web page or JavaScript cloaking, especially because I have seen other boats rise with them, at least for the moment – sometimes a long moment. I think it can become frustrating for folks playing fair to see how long it takes Google to respond with new code and patents – one example off the top of my head: Google allowing what I think of as nonsense like keyword-stuffed domain names.
All The Best,
Sean
@Andy I agree. Could be good, could be bad. It’s always scary when a scaled algo has the potential to toss the baby out with the bathwater.
Why anyone would try to engage in anything but white hat SEO these days is beyond me. The days of taking shortcuts should be in the rear-view mirror… quality in, quality out, works every time.
This is really interesting – the idea of trying to find spamming and manipulation based on people making rapid changes to a page. I do wonder, though, what happens with innocent people who are just trying to optimize their pages legitimately. I did a whole series of changes during December based on using the SEOMoz free software (the 1 month free trial). At the same time, I tried to optimize titles and reduce outgoing links to my 2nd site. Thankfully in January, the hits to my site really started to increase, but I do believe that some of that was more due to seasonality.
Hi JohnH,
Waiting seems to be the response under the patent. This patent was originally filed years ago, and if Google used it at some point, it’s very much possible that they replaced it with something else, like Panda or Penguin, that might better target behavior that focuses more upon what Google considers to be quality signals. Not too long ago, Matt Cutts was in a video stating that “just because Google published a patent on something doesn’t mean that they are using it currently,” and he specifically referred to this patent.
The churn and burn behavior you describe has been around for a long time – it’s not a new big thing, but it is a risky way to run a business, expecting that at any moment Google might cause your site to lose some or most of its rankings and traffic.
So what is the bottom line here? If your rankings change after you have made a change, wait Google out? No wonder the new big thing is churn-and-burn websites that rank well for a few weeks and then disappear. I guess with the cost of domains as low as $3 each, that makes good business sense.
Thanks for your reply Bill 🙂 Google also might cause your site to lose some or most of its rankings and traffic even if you do the “right” thing, as many businesses discovered earlier this year. In my opinion, relying on Google in any way is a risky way to run a business. Now I know how farmers feel worrying about the weather!
The thing with churn and burn is you can spread your risk across multiple sites for very little money. I calculate it would only cost about $500 USD to register and build 100 web sites including domain names and hosting. And if you know a little bit about MySQL, it only takes 5 minutes to completely clone a site to a new domain name.
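To give a sense of what I mean by that, here’s a rough sketch of the database side of a clone – with made-up database names and placeholder credentials, not anything I actually run:

```php
<?php
// Rough sketch: copy every table from an existing site's database ('site_old')
// into a fresh database ('site_new') for the cloned domain.
// Database names and credentials here are placeholders for illustration.

$mysqli = new mysqli('localhost', 'user', 'password', 'site_old');
if ($mysqli->connect_error) {
    die('Connection failed: ' . $mysqli->connect_error);
}

// Create the target database for the cloned site.
$mysqli->query('CREATE DATABASE IF NOT EXISTS site_new');

// Copy each table's structure and rows into the new database.
$tables = $mysqli->query('SHOW TABLES');
while ($row = $tables->fetch_row()) {
    $table = $row[0];
    $mysqli->query("CREATE TABLE site_new.`$table` LIKE site_old.`$table`");
    $mysqli->query("INSERT INTO site_new.`$table` SELECT * FROM site_old.`$table`");
}

echo "Database cloned; point the new domain's install at site_new and update any stored URLs.\n";
```

After that, it’s mostly a matter of copying the files and updating whatever URLs the application stores in its settings.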
@JohnH – I have experienced this first-hand. After being in the number 1 position for more than 4 years, earlier this year we dropped away and are now fighting the fight. Business took a turn, and it would definitely have been helpful not to have relied on Google. After such a good run, it became too easy.
@Bill – Great article. If only we knew then what we know now. I have bookmarked you and will be back for more SEO articles. Thanks.
Thanks, Amanda.
Sorry to hear about the increased competition that your site is facing. It probably isn’t a good idea to rely upon Google for too much – finding a diversity of traffic sources for your business is a good idea.