In the middle of January I noticed that several of the websites I’m connected with had started showing a large number of 404 errors, sometimes enough to trigger a warning email from Google’s Webmaster Central.
The links on the page used a typical cloaked, click-tracked affiliate pattern (actual domain redacted).
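The original snippet is redacted, but here is a minimal sketch of the pattern, assuming a common /go/-style cloaked link with a small JavaScript click tracker (the paths, domain, and function name are illustrative, not the site’s actual code):

```html
<!-- Cloaked affiliate link: the href points at a local /go/ URL instead of
     the merchant, and a click handler pings a local tracking endpoint. -->
<a href="http://example.com/go/some-product"
   rel="nofollow"
   onclick="trackClick('some-product')">Some Product</a>

<script>
  // Fires a tracking request before the /go/ redirect runs. If the plugin
  // behind /go/ or /track breaks, every one of these URLs starts returning
  // a 404 to visitors and to Googlebot alike.
  function trackClick(slug) {
    new Image().src = '/track?link=' + encodeURIComponent(slug);
  }
</script>
```

When Googlebot requests these /go/ and /track URLs and the machinery behind them breaks, the result is exactly the pile of 404s that triggers those Webmaster Central warnings.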
While Google hasn’t officially said that having a large number of 404 errors will lower your website’s ranking, you can bet that if it’s important enough for Webmaster Central to send you a warning email about it, it’s not a plus for the quality of your site. As a publisher I was faced with three choices:

- Remove the link cloaking and click tracking and hope Google doesn’t care about the affiliate links
- Roll the dice and hope Google doesn’t count the large number of 404s against me
- Replace the cloaking setup with a dependable method that tracks clicks and redirects properly instead of throwing errors (sketched after this list)
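To make that third option concrete, here’s a minimal sketch, assuming a small Node.js/Express handler stands in for the plugin (the route, slug table, and logging are my assumptions, not what the site actually ran). The cloaked /go/ URLs keep working, but as plain server-side 301s with nothing experimental between the visitor and the merchant:

```javascript
// Minimal sketch: serve cloaked /go/<slug> links as predictable 301 redirects.
// Assumes Node.js with Express installed (npm install express).
const express = require('express');
const app = express();

// Map each cloaked slug to its real affiliate destination.
const destinations = {
  'some-product': 'https://www.merchant-example.com/product?aff=12345',
};

app.get('/go/:slug', (req, res) => {
  const target = destinations[req.params.slug];
  if (!target) {
    // Unknown slug: answer honestly so broken links show up in your logs.
    return res.status(404).send('Unknown link');
  }
  // Track the click server-side, then redirect; no client-side script to break.
  console.log(`click ${req.params.slug} -> ${target}`);
  res.redirect(301, target);
});

app.listen(3000);
```

The point isn’t this particular stack; it’s that a boring server-side redirect behaves the same way for every visitor and every crawler, which is exactly the kind of predictable, dependable result the next paragraph argues for.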
In my opinion, part of an SEO’s job is to remove as much guesswork from the process as possible and find solutions with predictable and dependable results. That’s why, as a site owner, publisher, or online business, it’s important that you not use experimental or unpredictable technologies for anything along the mission-critical path of your website. So what should you take away from all of this?
- Be aware of and monitor the number of errors your website is reporting (see the sketch after this list)
- Take a proactive role in maintaining the health of your website BEFORE it’s too late
- Use bleeding-edge technologies sparingly, and never along a mission-critical path or to serve up essential content
- Be vigilant to the fact that Google can and will change the rules of the game without warning or notice, and with complete disregard for your website
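On the first point, here’s a minimal sketch of what monitoring can look like, assuming a standard combined-format access log at a made-up path (the path, threshold, and log format are assumptions, not anything from this site):

```javascript
// Minimal sketch: count 404 responses in an access log and warn past a threshold.
// Assumes a common/combined-format log at ./access.log; adjust path and threshold.
const fs = require('fs');
const readline = require('readline');

const LOG_PATH = './access.log';
const THRESHOLD = 50; // warn when more than this many 404s show up

async function count404s() {
  const rl = readline.createInterface({
    input: fs.createReadStream(LOG_PATH),
    crlfDelay: Infinity,
  });

  const counts = new Map(); // URL -> number of 404 hits
  for await (const line of rl) {
    // Combined log format: ... "GET /path HTTP/1.1" 404 512 ...
    const match = line.match(/"(?:GET|POST|HEAD) (\S+) [^"]*" 404 /);
    if (match) counts.set(match[1], (counts.get(match[1]) || 0) + 1);
  }

  const total = [...counts.values()].reduce((a, b) => a + b, 0);
  if (total > THRESHOLD) {
    console.warn(`WARNING: ${total} 404s; don't wait for Google's email.`);
  }
  // List the worst offenders so broken cloaked links stand out immediately.
  [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, 10)
    .forEach(([url, n]) => console.log(`${n}\t${url}`));
}

count404s();
```

Run something like this from cron and pipe the output to email, and you’ll hear about a 404 problem from your own server before you hear about it from Google.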