Are you a robot? A spammer? A sock puppet? A trusted author and content developer? A trusted agent in the eyes of Google? (More on trusted agents below.)
When you interact on a social network, write a review online, or submit an update to an internet mapping service, how much does the service you are using trust the content that you add, or the changes that you might make?
These aren’t rhetorical questions, but rather ones at the heart of approaches from services like Google Web search and Google Maps, which are focusing more and more upon social signals and social collaboration to provide the information that they do to the public.
If you’ve seen a +1 button within Google’s search results or on a site, and you’ve clicked upon it, or shared a page or post or site in Google Plus with others, you’ve engaged in endorsing the work of the author who created that site. How much weight does Google give that endorsement?
If you find an error on a Google Place page, such as an incorrect phone number or bad street address, and you take the time to try to correct that, what process might Google go through to decide if you’re telling the truth?
Google’s Crowdsensus Algorithm
In a whitepaper from last year, Reputation Systems for Open Collaboration (pdf), Bo Adler of Fujitsu Labs of America, Ian Pye of CloudFlare, Inc., and Luca de Alfaro and Ashutosh Kulshreshtha from Google describe two different collaborative reputation systems that they worked on. One of them is the WikiTrust reputation system for Wikipedia authors and content, and the other is the Crowdsensus reputation system for Google Maps editors.
Both systems are interesting, and as the authors note, both fulfill very different needs in very different ways. The overview of Crowdsensus, the needs that it fills, and how it differs from the WikiTrust reputation system presents an interesting look at how Google might approach other reputation systems where they might not want to explicitly share the reputation scores of people who participate.
While we are given some hints in the paper about how Google might use reputation scores when looking at edits people provide on business location information in Google Maps, we aren't provided much about how those reputations are calculated. We are told, though, that the use of reputation scores results in much smaller error rates than in a system that doesn't use them.
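The paper doesn't spell out how the reputations are computed, but the general shape of this kind of system is easy to sketch: user reputations and the "true" values for each location are estimated together, in a loop, until they settle. The Python below is purely illustrative; the function name, the equal starting reputations, and the agreement-based re-scoring are my own assumptions, not details from the paper.

```python
# Hypothetical sketch of a Crowdsensus-style fixed point: consensus values
# are chosen by reputation-weighted vote, and reputations are then updated
# by how often each user agreed with the consensus. All details assumed.

def crowdsensus(edits, iterations=20):
    """edits: {location_id: [(user, proposed_value), ...]}"""
    users = {u for votes in edits.values() for u, _ in votes}
    reputation = {u: 1.0 for u in users}          # start everyone equal

    for _ in range(iterations):
        # 1. Pick a consensus value per location via reputation-weighted vote.
        consensus = {}
        for loc, votes in edits.items():
            totals = {}
            for user, value in votes:
                totals[value] = totals.get(value, 0.0) + reputation[user]
            consensus[loc] = max(totals, key=totals.get)

        # 2. Re-score each user by agreement with the current consensus.
        for user in users:
            agreed = total = 0
            for loc, votes in edits.items():
                for u, value in votes:
                    if u == user:
                        total += 1
                        agreed += value == consensus[loc]
            reputation[user] = agreed / total if total else 1.0

    return reputation, consensus
```

Even a toy loop like this shows why error rates drop: a user who repeatedly submits bad phone numbers loses weight in every later vote.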
I did write about another approach that Google might follow with edits to business locations for Google Maps last month in the post GPS to Correct Google Maps and Driving Directions as a Local Search Ranking Factor?, which looked at a Google patent titled Trusted Maps: Updating Map Locations Using Trust-Based Social Graphs, which also uses a collaborative trust/reputation approach in how it weighs edits to Google Maps.
Not All Google +1s are Equal
I’ve written about Google’s Agent Rank here a few times recently, and Google published a new Agent Rank continuation patent application last week which expands upon one aspect of the patent filing within its claims section.
Within the description section of the Agent Rank patents, we are told that:
Not all references, however, are necessarily of equal significance. For example, a reference by another agent with a high reputational score is of greater significance than a reference by another agent with a low reputational score.
Thus, the reputation of a particular agent, and therefore the reputational score assigned to the particular agent, should depend not just on the number of references to the content signed by the particular agent, but on the importance of the referring documents and other agents.
This implies a recursive definition: the reputation of a particular agent is a function of the reputation of the content and agents which refer to it.
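That recursive definition has the same shape as PageRank-style link analysis, just computed over agents instead of pages. As a toy illustration (mine, not the patent's), a damped iterative computation over an endorsement graph converges to exactly the behavior the patent describes: being referenced by well-regarded agents is worth more than being referenced by obscure ones.

```python
# Toy illustration of the recursive idea in the Agent Rank patents: an
# agent's score is fed by the scores of the agents who reference it,
# damped PageRank-style. Damping factor and iteration count are assumed.

def agent_rank(references, damping=0.85, iterations=50):
    """references: {agent: [agents whose content it endorses]}"""
    agents = set(references) | {a for refs in references.values() for a in refs}
    rank = {a: 1.0 / len(agents) for a in agents}

    for _ in range(iterations):
        new = {a: (1 - damping) / len(agents) for a in agents}
        for referrer, endorsed in references.items():
            if endorsed:
                share = damping * rank[referrer] / len(endorsed)
                for a in endorsed:
                    new[a] += share   # a referrer passes on its own standing
        rank = new
    return rank
```

In a graph where two agents endorse `c`, and `c` endorses `d`, `d` ends up ranked above its lone endorser's peers, because the endorsement it received came from a well-endorsed agent.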
The claims section of the newest version of this patent is transformed to focus upon this aspect of Agent Rank. It introduces the concept of “trusted agents,” who might endorse content items created by others.
That kind of endorsement can increase the reputation score for the creator of that content.
The patent doesn’t explicitly mention things like a +1 of a page or content, or the sharing of a page or Google Plus post, but the framework that it presents is one that could easily encompass those types of activities.
The patent doesn’t elaborate upon who a “trusted agent” might be, or how someone becomes a trusted agent. Is Google using Agent Rank as part of how they develop reputation scores for people who are creating content associated with authorship markup, and the digital signatures those provide, as well as people who post at Google Plus?
When Google initially announced that they would be looking at Authorship markup in June, they told us that they helped integrate it into a number of sites such as the New York Times, the Washington Post, and CNET. They also added the markup to everything hosted by Blogger and YouTube, so that everything published there would automatically include the markup when published.
While doing that, Google introduced digital signatures to a large amount of content on the Web, which is a good start towards introducing Agent Rank to the Web, and the use of reputation scores in the ranking of content on the Web, as well as a way to help identify who the original author of that content might be.
Here are some of my earlier posts on Agent Rank and reputation scores from Google, which may help to provide some background details on how Google might use an Agent Rank/User Rank approach to integrating the concept of reputation and trust into Web search rankings:
- Google’s Agent Rank Patent Application – On the original Agent Rank patent, and its potential use of digital signatures to associate content with the original creators of that content.
- How Google Might Rank User Generated Web Content in Google + and Other Social Networks – Which looks at the User Rank approach that Google has been developing with its “Confucius” Q&A sites, that looks beyond shares and +1s to contributions from authors and their meaningful interactions with others in an automated manner that can be used to develop a reputation score for people.
- Author Markup, Schema.org and Patents, Oh My! – About how Google was enabling people to use HTML markup to “claim” content that they have created.
- After Authorship Markup, Will Google Give Us Author Badges Too? – Google profile images appearing upon your pages, and Google Plus Badges sound very similar to the badges described in the patent I wrote about in this post.
- Agent Rank, or Google Plus as an Identity Service or Digital Signature – When Google’s Eric Schmidt noted at a few public events this past summer that Google Plus wasn’t a social network, but rather an identity service, it seems that response wasn’t so much about the use of anonymous names on Google Plus, but rather how digital signatures might fit into Google’s future.
- Google’s New Freshness Update: Social Media Has Changed the Expectations of Searchers – One of the ways that a search engine can identify topics and queries that are trending is to keep an eye on social media services like Google Plus. It’s quite possible that topics noted by people with higher reputation scores might be given more weight than people with lower reputation scores.
One of the things that I’ve been watching in Google Search results is how often I might see an author profile next to a search result from Blogspot, since Google supposedly integrated authorship markup for content at Blogger. I don’t recall seeing any yet, and I’ve been wondering why.
Is it because I’m just not seeing any blogspot results, or is there some kind of reputation threshold that needs to be met by the authors of those posts before Google will start showing them?
I know more than a couple of people who have added authorship markup to their pages, and haven’t started seeing authorship profiles next to content they’ve created when it shows up in search results. Is that because they need to meet some level of reputation first? Is it because Google has purposefully limited who it is showing profiles for at this point, and deciding upon where a reputation threshold should be?
Is it a question of trust on Google’s part?
Are reputation or user rank scores influencing rankings in search results at present? Chances are that they may be in the future, if they aren’t now.
How does one become a “trusted agent?”
Added November 29, 2011: I highly recommend checking out Justin Briggs’ post Building The Implicit Social Graph, which takes a thoughtful look at how Google is exploring the relationships and interactions between people on social networks. As Justin concludes there:
It’s no secret that the social graph appears to be the next evolution with increasing uses of social factors, social elements in search, and mechanisms that will lead into AgentRank/AuthorRank, which will tie directly into the implicit social graph.
116 thoughts on “Are You Trusted by Google?”
Nice discovery! I’m curious: how did you find the USPTO filing?
Anyway – this sounds like Google is building its own Klout. Does that mean we should still bother with our Klout scores?
I’m guessing social shares are a part of it. How many real people who follow your niche engage with your content? I’m just as excited to follow this as you are.
This same question arose when I checked out your post/presentation last week about social ranking factors. It all makes sense. I guess my answer is, a trusted agent is someone who shares quality and “trusted” information on a regular basis. So if I constantly +1 pages that are already highly favored by the traditional Google ranking algorithms, then I must know what I’m talking about and my “trusted agent” score or what have you increases. Then in time I might be able to influence a new site/page and give it a boost in rankings because of my “trusted agent” status, as Google continues to tweak and give agents and +1’s more weight on rankings.
The core flaw in Google’s algorithmic strategies — this reliance on PageRank-like models in nearly every approach to sorting through content — is that it creates Value Bottlenecks by favoring sites that benefit from the Law of Preferential Attachment. The unspoken assumption is that “quality attracts quality”, and that’s just not the case. Quality is very subjective and showing users the preferentially attached targets over and over again skews the searchers’ perception of what is good quality.
So if you base your trust algorithms on these elitist mechanisms, then your most trusted agents form a small cadre of users who can (and most probably will) get away with just about anything.
Trust must be ascertained on the basis of behavior; otherwise it is too easily manipulated, just as links and other citation-based methodologies are too easily manipulated.
This reminds me of the Open Directory Project – DMOZ – and their [trusted] editors. Despite good intentions, the quality of the directory results was sometimes negatively impacted because editors were after all merely human. Besides a tendency to become pompous all powerful beings reigning over their categories, because of their power they could skew results for their friends by eliminating their competitors. Will this be a matter of “with great power comes great responsibility” or will 500 bucks get you a +1 from a trusted agent with a high index? I know some of the big box SEO companies will try to manipulate this possibility for their $5,000 a month SEO clients.
Love it. Nice work Bill – incredible post, well put together. I would theorize becoming a trusted agent is simply being involved in the “social” community and being tied in through relevancy, either through how you contribute or how your closest connections contribute to that topic. Almost like degrees of separation from the most relevant authority in the vertical. Being the most relevant authority in the vertical shouldn’t necessarily deviate from the existing algorithm.
It’s almost becoming comical seeing all these various ways that Google is trying to keep up with the web and apply an algorithmic solution to them.
It certainly keeps all of us in business, but I’ve already had clients feeling overwhelmed by all the items that they need to maintain. (Course, I offered to do it for them…for a fee!)
Awesome post Bill! I came across your blog about a month ago and am very impressed by the quality of your work. Good stuff!
I think to become a trusted agent in Google, one should concentrate on the content that they are posting. A lot depends on the quality and uniqueness of the content you post. Whether you are doing a blog comment or an article submission, you should not spam; just give the best possible content there.
What’s interesting to me is the dichotomy between how Google displays these faces and the actual likelihood they do anything with them. I think it’s very possible that’s the case, even though “doing” something with them could mean only using CTR impacts to change the actual SERP, which may allow it to be implemented without any fears of how the search result may change. So it could offer a bit of a “visual” cookie with SERP impacts, but only on the surface, not below it.
Mikhail, I agree with that. It is the best place to start, as there is no point in just filling a site with garbage content even though I have to deal with it all the time with clients. 🙂
Bill the Crowdsensus Algorithm is interesting, I never heard about that being used before. That is a really interesting way to weigh trust.
Agent rank sounds like it could be a good thing. Like some of the other commenters here, I think ultimately good content is still the key. Good content attracts attention from social networks. It will always be Google’s job to return good relevant content. The end goal hasn’t changed and never can. The big question is, will it be open to abuse, it seems like everything else is.
I noticed that someone else mentioned DMOZ editors. I’m kind of glad that in Google’s case, results are largely down to the use of algorithms. At least they don’t have mood swings.
I got really laid into by an editor on my blog. Caught him out in my stats. Would love to see his face when he comes back and sees that I have rumbled him in a big way.
This is ultimately a positive development; further refining the authoritativeness of (even nofollow) social media derived inbound links to a domain. Google are masters at continually increasing the sophistication of their ranking signals. They sure do keep the target moving… The cream, though, is still invariably rising to the top with some consistency – I had the experience of spyware forcing Yahoo as my default search engine on reboot, and I was amused by the huge gulf in relevance between the two engines still.
At a fundamental level, Agent Rank seems to be looking at the same metrics as Digg – where most of the power ends up being in the hands of a few Diggers.
Of course there are plenty of other factors Google can throw into the mix. How well a user’s +1’s correlate to their “trusted” index, for example. Definitely a field open for research.
Figuring out what gets you that “trusted” authorship status in the SERPS will be interesting! Is it simply by being the main author of a “trusted” site, a “trusted” +1’er by your actions – or by virtue of both?
– If so to what proportion in each category?
Question: Why doesn’t Google rank my blog higher than other secondary sites when I’m really the one who’s updating it? I mean with the keywords I’m using.
Great post. It totally makes sense that google reach out for this kind of profiling and gives more credit to trusted agents. Might be quite hard to set up but we are getting there.
I am curious to know how to be considered a trusted agent as well, and what the threshold would be… Anyway, seems you are the one to find out before anyone else… so…
first of all i want to tell you that i am completely thrilled with the thorough research that you do on your blog, and now i am about to write again (inspired by your blog) on agent rank and its meaning for the SEO community.
i think that if agent trust rank does exist, google may get its conclusions based on two methods that it already has:
1. our good old friend – TRUSTRANK, only this time the seed list will be of agents instead of websites.
according to different criteria it will be easy enough to create a seed list of agents for each term and area (such as : SEO, sports, marketing etc.)
2. metaweb – a semantic database google bought last year. now combining all of the relationships between entities with the activity in google plus it gets more and more possible.
if this is so – SEOs should start growing their own high quality google plus personas, only these ones will be much harder to fake than the facebook ones.
either way this makes social activity in the center of all things.
i also suggest that video content become a part of a healthy site, since google’s attention to the social side of search has gone up.
don’t know what about you but i got my new JVC full hd video camera set and ready to shoot 🙂
Very interesting concept. I wonder if or when the idea of author will be common in search queries on google. I am launching a niche social network that focuses on some of the same ideas. The goal is to align the interests of all stakeholders that participate in user generated content.
I had read your article about authorship markup, but I did not realize at that time the importance that Google could grant it. Adding rel=author to all of my articles is now part of my todo list :)
Great informative post 🙂
i have a question – is it possible for a domain name to get pagerank 4 even without any website just domain name?
when i checked i was surprised, how this Google works really strange 🙁
@umang: you can find a domain name with PR even if there is no website associated with it, for example if a PR5 website makes a mistake and creates a link to a domain that doesn’t exist (or doesn’t exist anymore).
Hi Eric (evolvor)
Thanks. I think you’ve given us a good starting point.
It’s not just a matter of giving a +1 to things that other people might also be endorsing, but also how Google might algorithmically value the posts and responses that you make at Google Plus, and in other places that might be connected to your Google Account (your website, your Twitter and Flickr accounts if you linked to them from your profile, and elsewhere): what kinds of topics you write about, the quality of those posts based upon algorithms Google might use to give them a quality score, and how meaningful your responses to other people are, also based upon a quality score algorithm.
Other people may give a +1 to something that you’ve written, or share it, or tweet or retweet it, or post it to Facebook or a number of other social sites. Those types of endorsements may also impact a reputation score for what you’ve created.
It does seem from the patent that once you reach a certain point in what you post, in how you interact with others, and in how others interact with you and what you’ve created, you do reach a level where Google does trust you, and your endorsements of content from others can impact their reputation scores as well.
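To make that idea concrete, here is one purely hypothetical way such signals could combine into a single score with a trust threshold. Every weight, every signal name, and the threshold value itself are my own assumptions for illustration; the patent describes no formula like this.

```python
# Hypothetical reputation score combining three kinds of signals:
# the quality of content you create, the meaningfulness of your
# interactions, and endorsements weighted by the endorsers' own
# reputations. All weights and the threshold are assumed, not Google's.

TRUSTED_THRESHOLD = 0.7  # assumed cutoff for "trusted agent" status

def reputation(content_quality, interaction_quality, endorsements):
    """content_quality and interaction_quality are in [0, 1];
    endorsements is a list of endorser reputation scores in [0, 1]."""
    # Damped mean: a few endorsements from high-reputation agents
    # count for more than many endorsements from unknowns.
    endorsement_signal = sum(endorsements) / (1 + len(endorsements))
    return (0.5 * content_quality
            + 0.3 * interaction_quality
            + 0.2 * endorsement_signal)

def is_trusted_agent(score):
    return score >= TRUSTED_THRESHOLD
```

Note how, under this sketch, endorsements alone cannot push a low-quality contributor over the threshold: their maximum contribution is capped by the 0.2 weight, matching the idea that you reach trusted status mainly through what you create and how you interact.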
I am just reminded of this phenomenon: http://www.seomoz.org/blog/top-100-digg-users-control-56-of-diggs-homepage-content
I don’t know that it is such a bad thing, but after that article was published, thousands of guides were created to aid people in their quest to become a top 100 user on Digg.
Reddit doesn’t really have that issue. Sure ‘i_rape_cats’ has a lot of social sway, but he only gets 1 vote like the rest of us. The difference is there isn’t a good way to track Reddit karma in a leaderboard view. This is a good thing. I sincerely hope that trust metrics are kept in the same fashion, and there doesn’t emerge a way to get a leaderboard of +1’s or ‘In circles’ metrics.
Thanks. How did I find this patent at the USTPO? I usually perform searches for patents like this on a regular basis, and have been for the last 6-7 years. 🙂
I think Google’s been building something like this since before the people at Klout even conceived of the idea. The thing is, there’s little to no value for Google in sharing or displaying the reputation or user rank scores that they come up with for specific users.
Klout reminds me a little of a site that used to treat blogs as if they were on the stock market, and value those blogs based upon things like links to them, and mentions of them. It was a fun site, but it actually provided little actual value to anyone who used it.
It is exciting to see how Google might incorporate Social Signals into how they might display and rank web pages and other web content. Social shares are a part of it, but only just a part.
I understand and appreciate your concerns.
A big part of the method that Google appears to be using doesn’t just rely upon who links to whom, who endorses what, who +1s what, and so on, but also involves a user rank based upon the quality of content created, and the meaningfulness of responses to other content. Google is ranking reputation on the quality of content, and the quality of responses to others, rather than just relying upon endorsements. See my link to “How Google Might Rank User Generated Web Content in Google + and Other Social Networks” from the post.
If you want to develop a high reputation, you need to actually create something of value. If you don’t do that, regardless of how many endorsements you have, and from where, it’s not going to make much of a difference.
The Web changes, it grows, it evolves. Searchers expectations change as well – they want answers to very specific situational and informational needs, and often they want information about very recent events almost immediately. It makes sense for Google to try to come up with algorithmic approaches, because there really isn’t an alternative if they want to try to keep up.
Sounds very reasonable. Interact with others in a useful and meaningful way on social networks and contribute things of value on topics that you are interested in, set up authorship markup to connect the things you author and create at places like your website and blog, your twitter and flickr and YouTube accounts, and so on, and you’re likely on the right path towards being trusted by Google.
I think the quality of content that you post is definitely one part of the equation. Since part of this involves signals from social networks, the quality of your responses to others, and the meaningfulness of your interactions also seems to play a role. This patent also points to how endorsements like a +1 or a response might also have an impact, but those seem like they would happen anyway if your contributions and interactions with others are useful and helpful and interesting and original.
Even if the impact of authorship markup and user ranks is only upon the display of results, such as the addition of an author profile and some information about others who may have shared or plussed a result, and so on, that may impact CTR. We really don’t know whether Google is ranking some results higher based upon these social signals. What we do know is that if Google has two choices of content to display that are identical or near duplicates, and one of them is associated with a Google Account and a “trusted agent,” they may show a preference for the content that they know more about with an author profile attached to it.
There’s no harm and much benefit in starting with publishing quality content. I think that is a very good starting point.
I hadn’t heard of the Crowdsensus Algorithm either, so it was really interesting to find it in that patent. Chances are that Google is also looking at other signals as well, some of which I wrote about in: GPS to Correct Google Maps and Driving Directions as a Local Search Ranking Factor?, in which I wrote about a Google patent filing with the title Trusted Maps: Updating Map Locations Using Trust-Based Social Graphs.
Google definitely is broadening the signals that they look at to determine the rankings of pages by also looking a social signals, and by associating digital signatures with content so that they can potentially filter out spam and duplicate content.
While an approach like this may potentially be open to abuse, part of its aim is also to substantially increase the amount of effort and time it might take for someone to manipulate user rankings/reputation scores, and search results. If you have to spend all your time creating believable social profiles and content and meaningful interactions to associate with those, you won’t be able to do it on the same scale that you might have in creating linkspam in the past, or fake profiles.
I have a hard time using Yahoo/Bing for my searching needs. I do try to do some searches at either site on a regular basis, but I’m usually more satisfied with Google’s results.
Those social media links may not pass along any PageRank, but that’s really pretty much irrelevant as you note. Google is getting more sophisticated and looking at social media from a different perspective, to find more recency-sensitive content and to find information about authors on social sites from authors who they know are the creators of that content.
Agent Rank is pretty much the process of associating content with digital signatures. It’s not the only element involved in how Google might rank content created by those authors, and things like endorsements and shares aren’t the only aspect of building a reputation. If you want to build your reputation score, you don’t need many people plussing your content or sharing it. Create great content on your blog, at Google Plus, at Twitter, at Flickr, and so on, get involved in meaningful discussions with others on specific topics and write something original and unique and relevant that broadens those topics in useful ways, or adds needed specificity.
Digg was pretty one dimensional, but Google looks at much more than how many times someone votes something up. Read through the post “How Google Might Rank User Generated Web Content in Google + and Other Social Networks” that I linked to above, which has been developed independently of Agent Rank, but seems to easily integrate with it to help in the creation of user rankings that rely upon the quality of your content, plus endorsements, and acts as a way to filter out spam, duplicate content, fake profiles and sockpuppets, and so on.
I don’t know.
Google looks at over 200 different ranking signals when it ranks pages in search results. Those can depend upon the relevancy of those pages, the links to those pages, how search friendly your site might be, quality scores that might be associated with your pages, how well optimized your competitors sites might be, how competitive the keywords you are targeting might be, and more.
Thanks. I’m keeping my eyes open, and if I find out more, I will share it.
There is a benefit to putting questions like that out there though, and having others share their thoughts and opinions and any other information or resources that they might have come up with.
I’ll look forward to seeing your post.
Keep in mind that Google’s trust rank is very different from the TrustRank that Yahoo came up with. Google’s trust rank is based upon annotations and labels from sources like Google’s custom search engines, and possibly other places such as SearchWiki, Blogger, and perhaps even social bookmarking and social networking sites. It has nothing to do with links and seed sets of sites. See: Google Trust Rank Patent Granted
Under an AgentRank/User Rank approach, the “trusted agents” referred to in the claims of this version of Agent Rank could potentially be very similar to the trusted sites identified in the Yahoo version of TrustRank. Definitely worth exploring.
The focus of MetaWeb was definitely upon identifying entities and upon associating attributes with those entities.
SEOs should definitely be working towards creating their own high quality and highly trusted personas, and working towards teaching their clients how to do the same.
Video definitely has a potential role to play in all of this. The original Google post that introduced authorship markup back in June mentioned that Google had automatically integrated authorship markup into all content at YouTube.
Not quite sure of the point you’re making, but I do sometimes do searches in Google for authors’ names. Perhaps something like a special search operator for Google that searchers could use like “author:first name, last name” where you could find everything that’s been written by a specific person?
There’s a lot of potential behind what Google may do with all of this, and a good part of it is in associating your content with your profile. Hope adding the markup goes easily for you.
Thanks for your rigor on this topic and all things : ) Do you think there will be any tools for detecting and merging different versions of profiles? At this juncture I understand even my own identity is at odds with multiple Google accounts, a company Facebook page and a wayward community page, multiple twitter accounts, etc. I go by Chris Reilly, but my legal first name is Christopher… clearly each of these scenarios presents a computing challenge in the form of canonicalization issues. Do you see in the crystal ball that tools may emerge to help create clarity for engines about Author/Agent profiles? Or are we left to obsess over name presentation like we do with NAP data in local seo?
I think there are enough differences between how Google is measuring reputation and how Digg did so that a problem like that will probably not surface when it comes to how social signals might influence search results.
The more Google relies on popularity-based signals like links pointing to pages, or clicks on search results, the more they are like that Digg problem. Broadening the signals used to calculate reputation scores, which might influence those rankings, beyond endorsements (+1s, shares, numbers of followers, numbers of responses) to include the quality of content created on social sites and the meaningfulness of responses also moves away from the limitations of Digg.
When Google acquired Metaweb last year, they went a long way towards gaining some expertise in identifying canonical and related profiles and information for specific entities.
But there are some simple things you can do that might help, like linking to your twitter or Facebook profile from your Google profile. Google suggests linking to other sites like this on their help page about links from a person’s Google Profile:
I am a chiropractor in St George UT. I try to educate my patients face to face as well as on my webpage. I don’t use the webpage just to hook patients, rather I try to answer questions and do some instruction along the way. It seems that no matter how hard I try, I can’t seem to get onto page one of google. To my SEO-untrained eyes, the site looks good. But I am still missing that magic something to tip the googlati scales.
Hi Dr. White,
You have a lot of competition if you are trying to compete with sites like Webmd and the Mayo clinic, along with every other chiropractor in the world for health terms related to what you offer.
Your best bet, and a much easier task, is to focus on trying to rank for less competitive geographically related terms first, such as [St. George chiropractor] or [Utah Chiropractor], which is probably a good idea since many of the people interested in your services are likely to be searching using those types of geographically related terms.
I imagine that it took a few years to get good enough at chiropractic care before you started offering services to clients. It can take a while to get good at SEO like that, as well.
Bill, we have recently added Google+ buttons to our site and blog. It makes sense that Google wants to be the authority on the value of sites. I think this is just Google’s way of competing with FB’s Like buttons, but your post is making me rethink this.
I had read recently that Google doesn’t care about who we are or what we do, I mean as bloggers etc.
They have one interest and that is that the information we provide is of use to the people who use their search facility.
Seems this is coming true more than ever!
Hi Bill, I believe there is some hidden formula for Google trust somewhere along the line. It’s a relief to know that it isn’t just our own website that is still waiting for the likes of authorship markup to appear in the SERPs – check out the below for example:
If authorship markup was meant to be integrated into Blogspot blogs, then there sure are a lot of them not working as they should, too.
As we briefly discussed before, I’m sure somewhere along the line Google Plus will have something to do with this…
I can see a person fixing a Google Places error for their own business, but for someone else’s? Doubtful.
As Michael M says, there could potentially be a small number of people who get away with “anything” so to speak.
But overall, this is a step in the right direction to reduce webspam.
Adding the +1 buttons is probably a good start. With Google Plus, they do seem to be looking at more than those buttons, but you do have to start somewhere. 🙂
I’ve been blogging for more than a decade now at one site or another, and have been fortunate to get visitors to the sites I’ve blogged at through Google. As a search engine, their primary interest is in getting searchers to the information the searchers are looking for. If you are writing things that might interest the people who are trying to find what you offer on your pages, using language that they might use to search for it, that can help. In many ways, most site owners’ relationship with Google is fairly symbiotic.
I’m not sure that I’ve seen a single blogspot blog showing authorship markup, even though the Google blog told us that they’ve implemented authorship markup for all sites on Blogger. It’s possible that Google might be holding off on showing authorship profiles for people using Blogger since Google is also doing things like letting people replace their Blogger profile with their Google Plus profile.
I agree that there’s likely some hidden formula before it starts showing, that could possibly involve more than how many circles someone might be in.
I’ve done things like report that some places showing up in Google Local are closed, and that others aren’t at the locations that are listed in Google Maps, and I didn’t own those sites. Unfortunately, there still are many places in Google Maps that haven’t been verified by their owners.
Hopefully, if someone has verified their site and business location, Google will contact them before changing location information for them.
It is good to see that Google is working on a few different ways to try to verify map edits and changes.
Thanks for your reply Bill. It’s good to know some people still go out of their way to contribute in this type of way. You’re right, so many business owners have yet to claim their Places page, amazing….
Yes, that would make sense. I think they could even be holding off on showing authorship profiles for other websites too (for whatever reason) as I’ve definitely not seen an increase of these appearing in the SERPs since they were released – which is what I would have expected…
This does sound a lot like Google’s answer to Facebook’s Like button. Will +1s impact PageRank?
@Mike: I feel that Google +1s will undoubtedly have an influence on organic rankings, along with other social signals such as Facebook Likes and Twitter retweets, on other search engines as well as Google. With Google still holding the majority of search engine market share, the introduction of the Google Plus social platform, along with +1s on listings in their web index, is their strongest chance of competing in the social sphere, since so many rely on traffic via their search engine. It will likely be their ultimate goal to utilise this platform to encourage more users of their social signal metrics and social platform – it would be foolish not to.
Thanks. Just checked Google’s My Maps last night, and they actually “denied” my reports that a couple of local places had closed. Wish there was a way to make it easier to verify that those places were actually closed, but both have been closed for more than a couple of months now.
Would love to see more people claiming their businesses in Google Places. Google adds places on their own, but there’s no guarantee that the information they put in those is correct and up to date.
Hi Geoff and Mike,
Google just started showing information in Google Webmaster Tools about authorship, but I’m wondering how to actually interpret what they are displaying. Interesting nonetheless.
It does look like the act of adding a +1 will influence the personalized results that people see, but we don’t know at this point if Google will look at that type of information in the aggregate and use it to affect rankings that everyone sees. There’s a possibility that Google will, but I suspect that even at this point they are doing a lot of testing and experimenting to see what might actually work best and provide the best experience for searchers.
Bill – The verification system could be improved, for sure. I can see unscrupulous business owners claiming a competitor’s page with their own phone number as an underhanded way to scrape off a couple of leads.
It’s unfortunate they denied your request, but again they need more concrete proof it seems, since you could also be a competitor trying to interfere… 🙂
Please excuse my use of the word “verify” when it comes to making map edits at Google Places. I’m not really talking about the Google Maps verification process, but rather the editing process when it comes to updating Google Maps/Local. The only way for Google to get it right most of the time is to have people from Google show up in person and check on businesses, but it’s also a very costly approach. Then again, Google has spared little expense in sending streetview cars out to film roads across the world.
I wasn’t making a verification request, but rather just reporting that some local businesses were closed. I know what it’s like to rely upon Google Maps to try to find local places, and follow Google’s directions to get to a business they have listed only to find that the business isn’t at the address that they’ve listed. That’s not good for people who rely upon Google Maps, and it’s not good for local businesses.
I’d be happy to provide more proof that one of the local bakeries and one of the local restaurants that I indicated were closed, if needed. Of course, Google also has new businesses at the same address listed now as well, and that should be some kind of indication that the old places might just be closed. But usually people who close businesses don’t have much incentive to remove their businesses from Google Maps.
Google should be careful when indicating that businesses are closed, of course. If possible, they could try to use the contact information provided by those businesses to see if they answer the phone, or get a response to an email. If the phone is disconnected, and the email is returned with a “no such address” message, that could be a hint that something’s up.
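Purely as an illustration of how those weak hints could be combined (the function, inputs, and threshold here are hypothetical, not anything Google has described), a sketch might look like:

```python
def likely_closed(phone_disconnected: bool,
                  email_bounced: bool,
                  new_business_at_address: bool) -> bool:
    """Guess whether a listed business has closed, using weak hints.

    No single hint is proof on its own -- a phone line can be down
    temporarily, an email can bounce for other reasons -- so require
    at least two hints before flagging the listing for review.
    """
    hints = [phone_disconnected, email_bounced, new_business_at_address]
    return sum(hints) >= 2

# A disconnected phone plus a bounced email suggests a closure;
# a bounced email alone does not.
print(likely_closed(True, True, False))   # True
print(likely_closed(False, True, False))  # False
```

Requiring more than one hint is the important part: it keeps a single flaky signal from wrongly marking an open business as closed.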
Good call, thanks for your clarification on this Bill.
You’re welcome, Jeff.
When it comes to Maps, if Google really wants to get more people to do things like verify their listings, and use advertising on Google Maps, they probably need to improve the quality of their listings, but it seems like such a potentially costly and time consuming undertaking. I understand how trying to do some crowdsourcing is really tempting and could be helpful, but even the verification issues you pointed out can be a potential problem.
I think users of a service like Google Maps have greater expectations than people did of the old printed Yellow Pages back in the day, in terms of timely and accurate information. I can’t even begin to imagine how much that might cost, though.
This is a great article on why crowdsourcing data for maps accuracy is difficult to trust: http://blumenthals.com/blog/2011/08/15/places-blackhat-playground-reported-to-be-closed/
One of the things I think Mike’s post illustrates is how important it is to check on your Google Places listing on a regular basis.
The yellow flag asking whether or not a place is really closed is much better than something that says it definitely is. It’s a check on Google’s part to get others to verify whether a report on the closing of a place is true, but it might still cause some possible visitors to think that a place reported as closed actually is. I wonder if it would help for Google to phrase that a little differently, such as:
“A Google user has reported this business as closed. Not true, click HERE.”
It makes sense for Google to try to come up with algorithmic approaches, because there really isn’t an alternative if they want to try to keep up.
I think Agent Rank, and related methods, are only going to grow in importance. A shift to rating based on author/contributor profiles has the potential to be as big a game changer as when links first became a metric. I think the launch of Google profiles (2009?) was probably the first big sign I can think of that this was on the way, and the timing seems right now with all the issues of recent years around thin/poor content.
In the short/medium term I think that this is all going to lead to some appalling sycophancy though as people try to win favour from ‘trusted agents’.
I agree. Any approach that Google might adopt that relies too much upon human review and judgment likely won’t scale well given the vast number of pages on the web and the amount of data available to base those judgments upon.
I agree about the growth of the use of Agent Rank. In hindsight, the launch of Google Profiles does seem to be a step along the way to the growing Google Plus platform. Google has definitely experimented with social networks, with services like Orkut, and with the Open Social API that they launched, but it hasn’t been until Google Plus where we’ve seen them start integrating aspects of their social networks into Google Web search results, which I think is a sign of how much importance they place upon it.
Hopefully people will realize that it’s more important to do what they can to become a trusted agent themselves than to try to win favor from others who are. 🙂
It is constantly amazing to me how you guys do what you do. Keeping up with all this information and staying ahead of the curve must be a daunting task. I’ve been a social media expert and executive producer for a long time and the “keeping up with the info” portion of the social media side was always a barrier to entry for me. It’s because of guys like you I choose to stay in it. Thank God someone writes about this stuff honestly.
I too have been trying to keep up with the “Trusted Agent” element, and it is daunting. I’ll make sure to keep up with you.
My question is this: am I doing harm to my “reputation” as we speak? Google has always been great at moving the target, but when the target can’t be seen I get nervous. I try to stay completely white hat; I just don’t want to do something inadvertently that puts me out of business. Thoughts?
It is hard to keep up with all of the changes, not only at the search engines, but also on the Web itself, including new technologies, new types of sites, and more. But it’s worth the effort to try to keep up and abreast of those changes, and even to find ways to learn about things that might change but haven’t yet, which is part of the reason why I spend a lot of time with patent filings and whitepapers from the search engines as well.
I’m not quite sure what you’re so concerned about, from your comment. If you follow good marketing practices and don’t do things that the search engines might consider manipulative or spammy, you probably shouldn’t run into too many problems.
It’s that “shouldn’t” that scares me. I try to follow all the rules but do the best I can for my clients. My concern is always pushing the envelope for them while staying completely within the rules. I’ve learned a ton by reading these posts, and hopefully I can keep up like you.
I guess what we’re saying here is that a “trusted agent” will more than likely be someone who consistently produces good content and also receives a lot of social attention from other trusted sites? I’ve been trying to figure the trust factor thing out for quite some time. Oftentimes I see sites rank highly with little content to offer and low Moz scores. They can hit position 2 or 3 with nothing really going for them other than the URL as the keyphrase. Could this type of site become a trusted agent, or would actual rankings have nothing to do with influencing another domain? Ranking should go hand in hand with trust, shouldn’t it?
Google does it again. Isn’t it amazing how easily they manage to get people to grow and promote their networks with just some simple link bait and a few new tools? There are already thousands of posts about setting up your Google Plus profile and linking it to your content and your content to your Google Plus profile.
All it takes is a few press releases and a couple of Google employees posting on their blogs, and lo and behold, thousands of webmasters run off to begin implementing and building their web properties.
It is no big secret that Google has attempted several times to gain market share back from Facebook in social media; maybe this time they have cracked the code.
Time will tell. It will be interesting to see if this one holds traction and takes off, or goes into the dumpster like earlier attempts such as Google Buzz…
It’s the “should” that should scare you. It scares me, too. That’s part of the reason why I bury myself in patents and whitepapers from the search engines, to get an idea of what might be going on in their heads.
Trust may end up playing a role in search rankings outside of personalization, but it’s possible that Google might focus upon using it in personalized search more at this point until there’s a wider adoption of things like authorship markup.
The change to the agent rank patent involves people who do have Google Accounts, and their path to becoming a “trusted agent” likely has something to do with at least their actions on Google Plus and possibly upon sites where they have claimed authorship. It’s possible that could spread to interactions in other places on the web as well in the future, but for now Google has access to information about data involving their interactions on Google Plus that it doesn’t have and likely can’t acquire from services like Twitter and Facebook.
I suspect that Google is learning from their mistakes.
I work as a real estate broker and a DIY blogger and webmaster, and this seems to be a bit nerve-wracking for me. I know competitors that will farm out their content creation and link building, as well as the big companies doing that.
Right now the top Vancouver, WA placements are being taken over by national websites that are not creating content based on knowledge gained by living and working in this local area, but rather by gathering statistics available online and regurgitating them automatically in mass quantities. These big companies then solicit us local brokers, asking us to pay big monthly fees for a portion of the leads they generate for the local market.
Is what Google is doing going to provide more authority to the local person generating good local content over the massive nationals that are leveraged by technology and funding? I do hope they’ll give us a fighting chance at this!
Thanks for your reporting on this!
Thank you for another great post. I wanted to get your take on the big “Search Plus Your World” update that Google just rolled out.
Over the last few years, the relationship between Search and Social Media has been growing, and this update effectively unites these forces on Google’s SERPs. From what I’ve seen, “Search Plus Your World” is both a display and algorithm change that will give searchers a much more personalized experience when signed in to their Google accounts (Gmail, Google+). The search results are now much more “social” and will include the typical public results from before as well as privately shared content from your Google+ friends. Searchers can even toggle between standard and “personal” search results. So now, standard SEO signals, your previous search history, as well as your social connections will all affect your search results.
Many industry experts are expressing privacy concerns as well as antitrust concerns. Google seems to be using their dominance of the search industry to try to push their Google+ social network past Facebook and Twitter. I even saw an article on SearchEngineLand that cited that the FTC is expanding its antitrust investigation of Google to include Google+. So this raises the question: not are we trusted agents, but is Google to be trusted?
What further impact do you expect this latest change to have on the search industry?
It’s possible that the authorship markup and Agent Rank approaches that Google is providing will work towards identifying people who write about a specific topic (including a geographic region), and interact with others about it, as authoritative on that topic.
We are seeing some of the impact of this on logged-in searches, in a personalized search manner. So if you are socially connected with people on Google Plus at this point, and possibly on other social networks in the future, and you are considered an “authority” on things that happen in the Vancouver, WA area, then pages that you’ve written or shared or plussed that are relevant to those people’s queries involving that location stand a decent chance of showing up in their search results.
Chances are that it would be easier for you to be seen as an authority in a specific area than some national site that would be spread too thin to try to focus upon so many different regions.
Will this kind of authority influence logged out results on Google at some point in the future? It’s possible, and I expect that more than a couple of people at Google are exploring how they can use that kind of information.
For people in the kind of situation that you are in, I often suggest that you focus upon smaller details and long tail queries about your area that a national company can’t afford to cover in such detail. Write about local schools, local events, local neighborhoods. Be a guide to people who are considering moving to your neck of the woods. Help people learn about things that they will search for, but the national companies won’t write about.
Thank you. Nice question. 🙂
I addressed a number of the issues around your question in a recent interview here:
At this point, it truly makes sense for Google to give preferential treatment to data that they have much more control over – their Google Plus data. It’s much easier for them to access that data quickly, check IP addresses to see where it was posted from, and see whether the person or people posting it might have hundreds of false profiles that they are using to try to +1 everything they can. Facebook has been limiting the amount of data that they want to share with Google for years, and simultaneously wants Google to feature it side-by-side with Google Plus posts? How does that work?
Likewise, Google was happy to feature tweets in its social search, but the agreement between Google and Twitter ended, and Google no longer has access to new tweets in the almost real time manner that they did. Google still indexes tweets, but they no longer feature them. Do they have as much value, and are they worth showing when they get old?
Those tweets were definitely interesting and useful, especially when you had a query that involved something that happened very recently, like breaking news or a natural disaster. I don’t know what the state of negotiations might be between Google and twitter regarding access to those tweets, but if Google could be shown to have negotiated in good faith for access to it, I don’t know that Twitter really has much of a leg to stand on.
I’d love to see Twitter beef up its search engine. It needs some serious work. I’d also love to see Facebook make their search more prominent, and improve how search on the site works. I wouldn’t mind seeing either in Google’s social search results, and I suspect Google wouldn’t mind it either.
Great insights, Bill. Nice interview, thanks for sharing.
I agree that Google wouldn’t mind seeing Facebook profiles and posts in their results, if they began sharing their data. It certainly seems like it would benefit both parties. Google appears to be dangling this carrot in front of Facebook to gain access into their “private garden”.
However, I’ve seen some significant backlash against Google’s Search Plus Your World. Most people that are against it say that they don’t want their search results “tainted” with social fluff and I tend to agree. While Google SPYW may seem interesting at first, I think this is a sugar high that may go away quickly for many power users. Like many other searchers, I don’t go to Google for social sharing … I use Facebook for that. I understand that Google may want me to turn to Google+ for my social needs, but I use Google to get the most relevant results and SPYW gets in the way of that.
I’ve read a few articles recently that mention the discontent among Googlers and the general consensus that Google is straying too far from their comfort zone. I doubt they would be doing this if Bing wasn’t so far behind …
I am all for innovation, but Google’s latest rollout seems to go against their doctrine of providing the best and the most relevant search experience possible. I agree with the stance stated by a few industry experts: Google is running the risk of isolating their core audience by diluting their search product.
Thanks again for the great discussion points, Bill, and keep sharing your wisdom.
Thank you. There’s also an interview that came out today with Google’s Amit Singhal on Google’s new social search features:
Two Weeks In, Google Says “Search Plus Your World” Going Well, Critics Should Give It Time
Given the poor state of the search engines at both Facebook and Twitter, it might not be a bad idea for them to share more with Google, especially given their complaints about not being included in Google’s social results. But is it a great idea for them to do that from a business perspective? I’m not sure.
As Amit Singhal says in the interview:
If Google just grabbed content from Twitter and Facebook, and tried to insert it into social search results without doing the kind of data analysis that involves doing things like filtering out fake profiles and sock puppets, it’s possible that those social results could become quickly filled with some serious spam.
And if Google decides to include content from other social networks, it might not be a bad idea for Google to add some user controls to social search, so that people using the service can toggle off viewing results from Google Plus, Twitter, Facebook, YouTube, Myspace, etc…
Thank you for your reply. I’ll need to get better plugged into G+ to see if that will have some positive effect.
I’ve also made good progress over the last year with more local, granular search and information pages. Even then, it is concerning how far down the Big Z and others pursue the search terms!
You’re welcome. Your advantage is still that you are located in the area you’re writing about. You get copies of newspapers and magazines that might not be published on line that contain news that you can share. You can carry around a point and shoot camera in your pocket and take pictures of places and events that the big companies can’t afford to send people to cover. The more that you, and others who are similarly situated in different locations can do to cover things that the big companies can’t, the better off you all are.
Does Google use other referrers to cross-check the references submitted by the end users? Anyway, what I like about Google is their focus on associating content with its original creators, which is quite crucial in the present-day scenario of spam and auto-generated content.
I am a newbie, and this write-up already made me reputation-conscious :/ I am gonna apply for authorship soon. Any tips you would want to give?
Google does try to use verifications in Google Maps, and in Google Plus they likely pay attention to where people might post from and when they might post, and look for patterns in that kind of data that might indicate how much weight and authenticity something posted there should be given.
Associating content with original creators does seem to make a lot of sense, and potentially helps solve a number of problems, like determining what content to display in search results when there’s more than one copy on the Web on different sites.
If you follow the steps that Google supplies, it shouldn’t be too difficult to set up authorship for your content. But it’s possible that they might look at your content, your contributions on Google Plus, how meaningful your interactions with others are there, and how many connections you have (people who have circled you) before your image might start showing up next to your content in search results.
Great interview. I hadn’t even heard of Agent Rank before. I’ll have to do some more research so I can understand that part of their Algorithm a bit better.
Thank you. Given how easy it might be for digital content to spread out across the Web without attribution to the original creators of that content, it makes sense for Google to explore options like Agent Rank and digital signatures associated with the content. I think it’s helpful for people who work on the Web and create content for it to know about how things like Agent Rank might play a much stronger role in how search engines work.
Thanks for getting back to me Bill. I agree that some sort of digital signature or Agent Rank would be awesome to help lots of people keep the attribution that so often seems to get left behind currently.
You’re welcome. With some recent news that Google will also come up with a comment system to be used on sites, one that is also tied to Google Accounts, we’re starting to see most of the aspects of Agent Rank being put into place by Google. If they add the “syndication” markup they are presently using in Google News to other types of sites, I think that attribution will be even more helpful to Google’s indexing of the Web.
Speaking of being on Google’s good side, have any of you ever received one of the GOOD memos from Google? It’s pretty interesting; here’s what I have seen:
I’m looking to speak with the marketing/business development team at __________________ (Removed for privacy).
I work on a team here at Google that partners with higher-potential B2B clients (many of them specializing in ___________) to help them grow their presence online and acquire new customers across Google (Google.com, YouTube, Google Display/Mobile/etc.).
We liked your site, saw you’d opened an account, and would be interested in scheduling a call with you to learn more about your digital/social strategy and to see if we can help.
Please let us know who the right contact might be (if that’s not you) and when would be best.
The more Google can refine the “trusted agent” part of their algorithm the better its search results will be. Crowdsourced collaboration results from experts in their given areas will provide amazing search results, but Google needs to be able to incorporate more than just Google+, they have to be able to incorporate other social signals: facebook, twitter, quora, etc. Maybe the easiest way is working out a deal with Klout?
I can’t say that I have received an email from Google like that in the past. Definitely very interesting to see the memo you’ve received. Thanks for sharing it with us. 🙂
For Google to effectively work with other social sources, they need access to collateral information about the content from those sources, such as the IP addresses that certain posts were posted from, a timestamp related to them, and so on. That’s the kind of information that would make it more likely that Google could screen out fake profiles and sock puppets, and not include them in its reputation scoring. Without that access, it’s better off if they don’t include that type of information.
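As a rough sketch of that idea (all of the names here are made up for illustration, and this is not anything Google has published), a reputation pipeline could simply decline to score any post that arrives without that collateral metadata:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SocialPost:
    author: str
    source: str                         # e.g. "google-plus", "twitter"
    ip_address: Optional[str] = None    # collateral metadata that external
    timestamp: Optional[float] = None   # feeds often do not share

def usable_for_reputation(post: SocialPost) -> bool:
    """Only count a post toward reputation scoring when the
    collateral metadata needed for spam screening is present.

    Without an originating IP address and a timestamp, there is no
    way to look for the posting patterns that betray sock puppets,
    so the safer choice is to leave the post out of scoring entirely.
    """
    return post.ip_address is not None and post.timestamp is not None

internal = SocialPost("alice", "google-plus", "203.0.113.7", 1326000000.0)
external = SocialPost("bob", "twitter")  # no IP or timestamp shared
print(usable_for_reputation(internal))  # True
print(usable_for_reputation(external))  # False
```

The design choice being illustrated is the one from the comment above: absent the collateral data, excluding the content is safer than scoring it.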
I would definitely be wary of including Klout information as well. According to Klout, I’m knowledgeable about surfing and China. I may surf the Web, and eat off china, but that’s the closest I get to being knowledgeable about either. I may have posted a tweet or two that used the words “surf” and “China,” but if their algorithm expanded upon those mentions by me to impute expertise at any level, they need to work on that algorithm some more.
Yes, I agree; to be effective, Google would need very deep access to these other social sources. In the future, these types of partnerships may be able to be set up… Until then, all we get is Google+ integration, unfortunately…
It is possible that the relationships between Google and other social services that might want to be included within Google’s social search results might change, or that Google might find other ways to attempt to uncover fake profiles and so on. Google’s social search is still pretty much in its infancy. Our options seem to be to wait and see at this point.
Hi Bill. First off, thank you for covering this topic. I actually came across your post searching for “website Google reputation score” which might be very useful at this time for me. We need to do a bit of de-linking for a website which might be compromised by links from the low reputation websites. Is there any tool out there to analyze that?