In mid-July of 2012, Google sent out a wave of “unnatural linking warnings” to many webmasters. They later said you could ignore these warnings. This was followed up by a Clippy-style, passive-aggressive letter saying they were going to discount the “unnatural links” but that you should file a re-inclusion request anyway. The whole situation is insane and confusing, to say the least, especially for online marketers (in case you hadn’t realized, this was a shot across the bow of internet marketers). Google is now easier to upset than a nitpicky bridezilla who doesn’t like her wedding cake, and you need to deal with it.
In my opinion, Google’s change in direction indicates a clear shift away from rewarding positive SEO signals…
For those of us who have been around the block, negative SEO has always existed, despite Google’s attempts to downplay it. Larger and extremely well-trusted sites really are immune, but mid- to low-level sites are vulnerable, especially if their real, naturally acquired backlink profile is weak or nonexistent.
How to Minimize Negative SEO
Negative SEO is a lot more art than science. Making someone else “look bad” requires looking at their individual situation and identifying weaknesses. However, there are some general steps you can take to minimize the number of bad and low-quality links your site gets, and to stop handing things out on a silver platter.
Full Feeds versus Partial Feeds
In the past I’ve been a huge advocate of publishing full RSS feeds over partial ones. Despite RSS never living up to its promise, I truly believed that publishing full feeds was good for the users; advertising, social signals, and page views be damned. However, the downside of publishing full feeds is that it makes it child’s play for scrapers to reuse your content without authorization or approval. It’s with a heavy heart and a bit of “damn you few bad kids for ruining it for everyone else” that I no longer recommend publishing full feeds.
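If you do make the switch, a partial feed summary can be as simple as the first N words of the post with the markup stripped out. Here’s a minimal sketch in Python; the function name and the 75-word cutoff are my own illustrative choices, not anything your feed software requires:

```python
import re

def feed_excerpt(html, max_words=75):
    """Build a partial-feed summary: strip the HTML tags, truncate to
    max_words, and append an ellipsis so readers click through."""
    text = re.sub(r"<[^>]+>", " ", html)   # crude tag stripper for the sketch
    words = re.split(r"\s+", text.strip())
    if len(words) <= max_words:
        return " ".join(words)
    return " ".join(words[:max_words]) + "…"
```

Whatever publishing platform you use almost certainly has a setting or plugin for this already; the point is simply that scrapers get the teaser, not the whole post.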
RSS Advertising Links
A little-known fact: I coded the links in my RSS feeds slightly differently than those at the end of each post. Why? So I could tell which way people were obtaining and using my posts. These RSS links were never strong, low quality at best, and Google usually ignored and discounted them. That said, I built up a significant number of them. Now that Google’s threshold for low-quality links causing problems is lower, I’m no longer recommending this technique.
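For reference, the mechanics of what I did were trivial: append a distinguishing query parameter to every link in the feed, then watch for that parameter in referral logs and backlink reports. A rough sketch (the `src` parameter name is my own hypothetical choice), included only to illustrate the idea, not as a recommendation:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def tag_feed_link(url, source="rss"):
    """Append a tracking parameter so links lifted from the feed can be
    told apart from on-site links. 'src' is an arbitrary name."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)   # preserve any existing parameters
    query.append(("src", source))
    return urlunsplit(parts._replace(query=urlencode(query)))
```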
Site Wide Low Quality Links
If you have a profile with Google’s Webmaster Central, you should be on the lookout for sitewide links, or large numbers of links, from sites that you aren’t affiliated with, that are of questionable quality, or that might be re-using your content. For example, all of the sites listed in the left-hand column below would fall into one or more of those categories.
Links worth taking a second look
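If you export your list of linking URLs, a quick way to surface likely sitewide links is to count links per domain and flag the heavy hitters. A rough sketch under my own assumptions: the 50-link threshold is arbitrary, so tune it to the size of your site.

```python
from collections import Counter
from urllib.parse import urlsplit

def suspect_domains(linking_urls, threshold=50):
    """Count linking URLs per domain and return the domains at or above
    the threshold -- a crude sitewide-link detector for an exported list."""
    counts = Counter(urlsplit(u).netloc for u in linking_urls)
    return sorted(d for d, n in counts.items() if n >= threshold)
```

Anything this flags still needs a human look; a legitimate blogroll link and a scraper’s sitewide footer link look identical in a count.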
DMCA Takedown Notices
The ability to file DMCA takedown notices with Google or a web hosting company has always existed but, to be honest, used to be a complete pain and not worth your time. Fortunately, that’s no longer the case. You don’t need to spend all of your time engaging in the pursuit of DMCA justice; just take care of the biggest offenders every 3-6 months.
So what are the takeaways from this post?
- Understand that Google’s algorithm is changing, and sometimes you will need to adapt to accommodate those changes
- Stop making theft of your content easy. Realize you can’t prevent it 100%, but stop handing it to people on a silver platter
- Stop automatically manufacturing low-quality links: they will probably do more harm than good
- DMCA violations: go after the biggest offenders every few months
- Do everything within your power to avoid low-quality links. Everyone gets some and that is unavoidable, but don’t encourage the practice