Guest Post by Matt Cutts on Spammy Link Building [Satire]
This is a guest post by Matt Cutts, Google’s “head of spam” by day and well-known SEO blogger by night.
I added some text decoration/formatting here and there to stress the main message and to improve readability.
In my recent article on the decay of guest blogging for SEO I raised some of the most common issues regarding low-quality, spammy link building with guest articles.
Speaking out to the community
This article raised some eyebrows to say the least. To make it unmistakably clear that guest blogging is not bad by definition I decided to speak out to the webmaster community by way of a guest post myself.
Why did I choose SEO 2.0? This blog has by now earned a reputation for being critical of Google.
I wanted to take a stand and show that Google is not afraid to face its most avid critics. In the coming weeks I will publish more guest posts over at
- Aaron Wall’s SEO Book
- Jacob King
- Agent Black Hat
and other publications known for their critical stance on Google policies.
Let’s start with the basics now: guest blogging is not spam per se. Only low quality guest posting solely or mainly for SEO purposes is spammy.
How to spot spammy guest postings
How do we know whether a specific guest article is to be considered low quality? We look at some specific issues.
I can’t disclose exactly how the Google algorithm deals with it, but I’d like to use an example from my own blog to show you what red flags a guest post can raise.
This is a guest article that my former colleague Vanessa Fox wrote on the Matt Cutts blog back in 2006.
I know what you’re thinking: “Matt Cutts was into guest blogging way before it was cool”.
Showing off is not what I’m after here. It’s to point out that even the best of us can make mistakes. Many people at Google are not knowledgeable about SEO at all.
I am, it’s my job after all, but as the Google Webmaster Guidelines change frequently, even I sometimes have difficulty keeping pace. So what did I do wrong back then, all those years ago?
Outgoing links to one domain
Low quality guest posts in most cases have too many links leading to the same domain or company.
In my case, I allowed Vanessa to add five outgoing links to the article. They all lead either to Google properties or to Pubcon, where she was speaking.
She added three links to Google owned sites including our blog at Blogspot and two to the conference.
No other relevant links were added. Spammers in most cases just link to themselves. That’s a common tactic.
Rich anchor text links
Guest blogging turns really spammy when guest authors add rich anchor text to the outgoing links to their own properties. Vanessa did it with all links to Google in her article on my blog.
She linked to our sitemaps page with the anchor text “sitemaps”.
She also linked to our blog about sitemaps with the anchor text “sitemaps blog”. Last but not least, she used “site review tool” for the final link to our webmaster tools.
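For readers unfamiliar with the jargon: “rich” (or exact-match) anchor text means the clickable text is precisely the keyword the target page wants to rank for. A hypothetical illustration in markup (the URLs are placeholders, not the actual links from Vanessa’s post):

```html
<!-- Keyword-rich ("exact match") anchor text: the pattern described above -->
<a href="https://www.google.com/webmasters/sitemaps/">sitemaps</a>

<!-- A more natural-looking, branded alternative -->
<a href="https://www.google.com/webmasters/sitemaps/">the Google Sitemaps program</a>
```

The first form is what anchor-text-based ranking signals reward, which is exactly why spammers favor it.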
No “nofollow” attribute added
John Mueller already told the webmaster community a few months ago that ideally all links you build, for example via press releases, should use the “nofollow” attribute. You need to tell Google and other search engines that it’s not a natural link from the editor of the respective publication.
Likewise, guest posts need to add the nofollow attribute to outgoing links in order not to manipulate PageRank.
We introduced nofollow together with Yahoo and Bing (then MSN) back in 2005, so I could have added it. How could I have known, though, what would happen almost a decade later?
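In markup terms, nofollow is simply a value of the `rel` attribute on an anchor tag. A minimal sketch, with example.com standing in as a placeholder target:

```html
<!-- A followed link: search engines may count it as an endorsement -->
<a href="https://example.com/">my favorite tool</a>

<!-- The same link with nofollow: tells search engines not to
     treat it as an editorial vote when computing PageRank -->
<a href="https://example.com/" rel="nofollow">my favorite tool</a>
```

Adding `rel="nofollow"` changes nothing for human visitors; the link still works. It only signals to crawlers that the link should not pass ranking credit.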
Shallow thin content
Vanessa is only covering SEO basics in her guest post on my blog. Even in 2006 it was all common sense.
Thus there was no need to regurgitate it again. It wasn’t even a genuine attempt to cover the Pubcon session thoroughly.
She just reported the obvious basics every webmaster should have known already. She apparently only did it again to be able to link out to our resources and tools.
Let’s face it, the whole post was shallow, thin content the Web could have lived without, but as a guest post it was OK.
After all, I published a lot of cat content on my blog in those years, so at least there was finally a slightly relevant article.
What does it all mean?
It’s about cleaning up the Web. We have to remove spam one link at a time. Ideally webmasters would stop linking out completely.
Then we could rebuild our algorithm from scratch based on Google+ and authorship markup. Until then we have to clean up this cesspool link by link. Join us in the fight against spam.
- don’t accept guest posts from strangers
- don’t link out to strangers
- don’t talk to strangers
In case you do, we at Google have the means to detect it and to act manually or algorithmically. Ever since we launched Google Books we know that hypertext is way overrated.
Books worked for hundreds of years without hyperlinks. So why can’t the Web?
Please follow me for more insights on how to make the Internet safe from spam again. Use only Google-approved ways to link to other sites.
Comments

Am I the only one thinking this has to do with Google’s ever-growing need to promote Google+ in the first place?
[…] Since Matt can post updates, I figured I can do that […]
Thanks for the update, Matt, and thanks for publishing the guest post, Tad. I updated mine too :)
I guess this post just about explained in detail what to expect from guest posts. The example of the guest post submitted by Vanessa Fox way back in 2006 kind of clarified some issues on how to avoid spammy links in guest posts.
In summary, following the webmaster guidelines from Google on link building and guest posts is still the best approach in getting rid of spammy links and enforcing quality and acceptable guest posts.
On kingged.com, the content syndication and social bookmarking website for Internet marketing, I have left the above comment where this post was shared.
Sunday – contributor for kingged.com
A work of pure genius. I’m making “don’t talk to strangers” my motto for 2014.
Accolade for Tadeusz!
Great post, Matt. I’ll definitely check out your blog!
“hypertext is way overrated”
He is kidding, right?
More detailed information for guest bloggers to stay alert. :) It is weird that most people on Google’s team don’t know about SEO.
[…] 2.0′s “Guest Post by Matt Cutts on Spammy Link Building” – Matt’s guest post attempts to make it […]
[…] Spammy Link Building […]
I will always link out to anybody I please, stranger or not. The Web does not belong to Google and there’s a lot of stuff in the Guidelines for Webmasters that is not even spam (unless one wants to see it through Google’s eyes).
Sorry Matt, but my answer is NO. Google’s philosophy is not mine.
Will this guest post hurt Matt Cutts in the SERPS?
[…] Ironically, Matt Cutts decided to follow up on this topic… in a Guest Blog Post at SEO 2.0 which you can read here: Matt Cutts on Spammy Link Building. […]
Isn’t Google plus authorship a link scheme?
Isn’t this an attempt to make just about every blogger in the world unnaturally link to Google Plus?
Someone please send Google Plus an unnatural inbound links warning for participating in link schemes.
I thought authorship was going to be a way to separate the crap posts from the more legitimate or higher quality ones.
Now that we can see Google is going to penalize all guest posts unless they nofollow the links anyway, isn’t authorship a moot point?
[…] I stumbled across a guest post that Matt had published over at SEO 2.0 regarding spammy link building that I had to share. Also a very good […]
Does no one else here smell April Fools? If this is Matt Cutts, it’s drunk Cutts. Matt is a good writer; this writing is horrible.
‘SEO blogger’ as the anchor text. Is this guest blogging on powerful sites just to get mattcutts.com ranking for that search term?
So the key to a good guest post is to link to 6+ websites per post, because a natural blog post has that many references… and within those 6+ links you get to throw a link or two to one of your clients, and you give other people links just for the sake of it. This could make guest posts even spammier, because people would just be linking to random sites because they have to.
It’s scary how I skimmed through this and thought it was legitimate… Even this part somehow didn’t register as odd (if coming from Matt Cutts):
“We have to remove spam one link at a time. Ideally webmasters would stop linking out completely. Then we could rebuild our algorithm from scratch based on Google+ and authorship markup. Until then we have to clean up this cesspool link by link. Join us in the fight against spam.”
This article is pretty confusing.
Matt denounces anchor text links to self serving properties but right in the first line there’s a followed link to his blog using rich anchor text. WTF?
“Ever since we launched Google Books we know that hypertext is way overrated. Books worked for hundreds of years without hyperlinks. So why can’t the Web?”
Huh? How would people get from one website to another and discover new websites? Oh, with Google? HA! Nice try Matt.
Just because books managed in the past doesn’t mean that links are bad. The ability to provide a direct path to other related pages is way better than the system of books.