Duplicate Text and SMX

One of the people I finally had the chance to meet at SMX was Vanessa Fox of Google. We’ve been twittering for eva (ok maybe 6 months but on the internet that is forever) and I think we’re really close to becoming BFF.

Since I am somewhat critical of Google on a regular basis, I thought it would be nice to point to some good things. I’m actually a really big advocate of Webmaster Central, and not just in public or because Vanessa will pick this up while ego surfing. You can even ask some of my darker-shades-of-gray friends, who will back me up on that.

Vanessa’s got a post up today (Official Google Webmaster Central Blog: Duplicate content summit at SMX Advanced) recapping some of the stuff they discussed at SMX. For the record, I’d really like the ability to specify URL parameters for Google to ignore, authenticate ownership, and get a duplicate content report.

Now that I’ve totally got your attention, I’ll throw in a shameless question about a debate I’m having with a nameless colleague. If I put this in my robots.txt file

User-agent: *
Disallow: */print/

would it block these files


and allow these

