Ok, let’s get the easy stuff out of the way. Am I mad because I scored low or something like that? Hardly; this blog got a 98. Am I taking a swipe at a competing product that I or one of my friends is offering? Nope. This is a bad product because it tricks people into thinking things are “OK” when in reality they aren’t (hey Jason Calacanis, if you want to start talking smack about SEO being silicon snake oil, this is where you should be looking).
Let’s take a look at the top blog on their report, engadget.com, which scored 98 out of a possible 100. That’s a pretty good score, so there really shouldn’t be too much wrong with it, right … (leans back and cracks knuckles).
1) Engadget has a wildcard URL problem, so I can link to them like this and it works: http://wedontknowhowtoconfigureawebservercorrectly.engadget.com/2007/10/04/new-york-city-taxi-drivers-threaten-to-strike-again/
I wrote about this back in May and nobody cared then and nobody cares now.
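One common fix here is a canonical-host redirect at the web server level. Here’s a minimal sketch assuming Apache with mod_rewrite, using example.com as a placeholder for the real domain: any request that arrives under a non-canonical hostname gets a 301 back to the real one.

```apache
<VirtualHost *:80>
    # Placeholder domain; swap in the site's actual canonical hostname
    ServerName www.example.com
    ServerAlias *.example.com

    RewriteEngine On
    # If the Host header isn't the canonical hostname...
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    # ...301 the visitor (and the search engines) to the canonical URL
    RewriteRule ^(.*)$ http://www.example.com$1 [R=301,L]
</VirtualHost>
```

The 301 matters: it tells search engines the wildcard URLs aren’t separate pages, so the link equity consolidates on one hostname instead of being split across infinite duplicates.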
2) Try to pull up a bogus URL on Engadget like this.
Not only do you get a page that doesn’t tell you that you went to a bad URL, but it returns a 200 status code instead of a 404; see for yourself.
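Checking this kind of thing is easy to script. Here’s a self-contained sketch using only Python’s standard library: it stands up a toy server that does what Engadget should be doing (known paths get a 200, everything else gets a friendly error page with a proper 404 status), then requests both kinds of URL and prints the status codes.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Paths that actually exist on the toy site
VALID_PATHS = {"/a-real-post/"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in VALID_PATHS:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html>the real post</html>")
        else:
            # A custom "not found" page is fine; the status code must still be 404
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html>sorry, no such page</html>")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral port and serve in the background
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    """Fetch a path and return just the HTTP status code."""
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", path)
    status = conn.getresponse().status
    conn.close()
    return status

print(status_of("/a-real-post/"))   # 200
print(status_of("/bogus-url/"))     # 404
```

Point the `status_of` helper at any live site instead and you can audit this yourself: if a clearly made-up URL comes back 200, the search engines are being told that page exists.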
3) A machine is never going to be able to give you guidance on how to focus your internal anchor text. For example, here’s a post from the homepage.
They are using words like [chatter] and [street] as anchor text, which really helps no one. Ok, we’re all lazy and do it from time to time, but a program is never going to be able to distinguish between an occasional minor lapse in internal anchor text focus and a systemic, site-wide failure. Don’t think stuff like this matters? Look at the New York Times.
4) According to Google’s TOS you aren’t supposed to push search pages into the index, but Engadget has 900+ search pages indexed. Is it a horrible offense? No, but it’s an easily correctable one.
5) Robots.txt file: never had one, never will. But hey, at least they are serving up a 404 header code this time 🙂
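For what it’s worth, those last two points could largely be handled in one shot with a robots.txt file. Assuming the search pages live under a path like /search (I’m guessing at the path here, adjust to whatever the site actually uses), something this simple keeps crawlers out of them:

```
User-agent: *
Disallow: /search
```

One caveat: robots.txt only stops future crawling. The 900+ search pages already in the index would also want a noindex meta tag (or a removal request) to actually drop out.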
Five things that could be fixed pretty easily, and I only spent 15 minutes looking. Ok, heading off a question at the pass now: gee Gray, if SEO matters so much, how come Engadget has like a billion page views a day and buckets full of money? Well, they have something called defensible traffic: built-in subscribers who come to them every day without using Google to do it. Now just think of how many more people and how much more traffic they could get if they did pay attention to SEO …
Back to Website Grader. Here’s my problem in a nutshell: there are some things in SEO that you can automate, and there are some things that you can’t, and a site audit is at the top of that list. You can use tools to help you gather the data, sift through it, sort it, arrange it, and dissect it any way you want. At the end of the day, you need an experienced analyst interpreting that data for you. You don’t want the SEO equivalent of a check engine light. There’s a reason site clinics are so popular at search engine conferences: people need expert advice on how to fix things from time to time.
So let’s play worst-case scenario: someone with a website who just discovered SEO finds Website Grader, runs it on their site, and gets a “good score.” They figure their website is good enough and leave things alone. Chances are that person won’t have the luxury of the defensible traffic that Engadget does, and could really use the extra traffic that SEO can bring to the party …
Lastly, I’ll leave you with this quote:
“Computers are incredibly fast, accurate and stupid; humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.” — Albert Einstein