I was tipped off to Wikipedia Eats Google through Steve Rubel’s Wikipedia all over Google. One of his main supporting points is this special query over at Google [* *], which is interesting. This is the author’s line of reasoning:
Google is becoming increasingly prone to Wikipedia. This is because Google’s PageRank™ algorithm, the method by which it ranks search pages, inherently succumbs to the basic structure and social structure of wikis.
The PageRank algorithm is most famously characterized as valuing pages that are heavily linked to by other people. It seems that is only part of the story. The PageRank algorithm values links to yourself more. That is, a website that has many pages and is densely interwoven with links becomes a sort of PageRank machine. True, without other websites conferring a little bit of their PageRank onto it, that website will not have a high PageRank, but given even a small number of external links from mediocre websites pointing to your very large, densely interwoven website, your website will shoot up through the listings.
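The “PageRank machine” effect the author describes is easy to see in a toy simulation. The sketch below runs standard power-iteration PageRank on a hypothetical graph I made up for illustration: two mediocre external pages each link to both a densely interlinked five-page site and a single standalone page, yet the dense site soaks up far more total rank. The node names and graph are my own assumptions, not anything from the article:

```python
# Toy PageRank (power iteration) illustrating how a densely
# interlinked site amplifies a few mediocre external links.
# The graph and node names below are hypothetical.

DAMPING = 0.85

def pagerank(links, iterations=100):
    """links: dict mapping each node to a list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # everyone gets the baseline "random jump" share
        new = {node: (1 - DAMPING) / n for node in nodes}
        # rank held by dangling pages (no outlinks) is spread uniformly
        dangling = sum(rank[u] for u in nodes if not links[u])
        for u in nodes:
            for v in links[u]:
                new[v] += DAMPING * rank[u] / len(links[u])
        for v in nodes:
            new[v] += DAMPING * dangling / n
        rank = new
    return rank

site = ["S0", "S1", "S2", "S3", "S4"]
links = {p: [q for q in site if q != p] for p in site}  # dense interlinking
links["lone"] = []                                      # standalone page
links["ext1"] = ["S0", "lone"]                          # two mediocre pages
links["ext2"] = ["S0", "lone"]                          # linking to both

ranks = pagerank(links)
site_total = sum(ranks[p] for p in site)
print(f"site total: {site_total:.3f}  lone page: {ranks['lone']:.3f}")
```

Both targets receive identical external links, but rank that flows into S0 keeps circulating among the five interlinked pages (leaking out only through the damping factor), while the lone page’s rank is immediately redistributed, so the dense site ends up with a much larger combined score.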
However, in true Freakonomics (no aff) style, I’m going to say the author misses a big part of the equation. There have been many back-and-forth debates over whether linking out to unique, authoritative web documents helps your rankings (see Hilltop analysis), but I can tell you I’m a firm believer in the concept, as are many other SEOs. In fact, many who practice the dark arts also link to authority websites. As a result of Google’s Sandbox filter, my site-building skills have improved dramatically: I can crank out websites with hundreds of empty pages waiting for copy in a few days. Now, if I were a true button-pushing spammer, I’d be filling these with scraped and uniquified content, and I’d certainly be linking to authority websites like Wikipedia to make my site appear more “hub like” and rise in the rankings.
Let’s bring it all together. Google gives ancillary clues that linking to authority websites improves rankings. Google delays ranking new websites with little trust or authority (aka the sandbox). To make up for lost income, site production increases. More autogenerated sites point to authority websites like Wikipedia. More sites linking to Wikipedia increase its authority score, and it rises in rankings across a wide range of categories and SERPs. Google’s reliance on link-based algorithms, coupled with overly restrictive trust-ranking filters, decreases the diversity of sites in its SERPs, which IMHO is a very bad thing.