Tips for Ajax for SEO

Whenever Ajax enters the conversation with an SEO or internet marketer, chances are good there will be a deep sigh or an “ugh” face. While it is true that search engines are getting better at indexing this type of content, we still aren’t at the point where you can realistically rely on them to index it properly, or even at all. That doesn’t mean you can’t use Ajax; it just means you need to take some extra steps to make sure that content is visible to crawlers and non-Ajax users.

The first thing you need to do is make sure a static page and URL exists for every end result of content. For example, let’s say you run a local travel website, and you have a location/map page that lets people view restaurants, hotels, attractions, or other information within a specific area. Users can turn filters on or off, look at different locations, and get detailed information about each venue. Having that work via Ajax and JavaScript makes for a very good user experience, similar to Google Maps integrating data from Google Places. However, “hiding” all that information behind Ajax won’t help you with your organic search traffic.

What you need to do is create specific, unique URLs for each of those destinations. These URLs need to provide the information in a way that ALL the search engine spiders can read and extract, not just the advanced experimental Ajax crawling spider from Google. This ensures you will get traffic from Yahoo, Bing, Facebook, Twitter, StumbleUpon, and, heck, even services like Blekko* and Wolfram Alpha. Relying on just one search engine or source for your traffic is a dangerous strategy and leaves you exposed to the whims of an algorithm update. (*Update: Blekko notified me they have an Ajax crawler.)
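
To make that concrete, for the travel site example above the static URLs might look something like this (the domain, city, and venue names are just made-up placeholders):

    http://www.example-travel-site.com/boston/restaurants/
    http://www.example-travel-site.com/boston/restaurants/union-oyster-house/
    http://www.example-travel-site.com/boston/hotels/
    http://www.example-travel-site.com/boston/attractions/freedom-trail/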

Once you have each of those pages, you want to make sure the URL is as search engine friendly as possible: short, with 3-5 keywords, and without parameters. While it’s a bit of overkill, adding the rel=”canonical” tag is a good idea as well.
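
As a rough sketch, the head of one of those static venue pages might look something like this (the URL and title are hypothetical):

    <!-- static, crawlable venue page: short keyword-rich URL, no parameters -->
    <head>
      <title>Union Oyster House - Boston Restaurants</title>
      <link rel="canonical" href="http://www.example-travel-site.com/boston/restaurants/union-oyster-house/" />
    </head>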

Where things get a little tricky is with inbound links, email, social media links, and user agent detection. Whether someone is viewing the Ajax version of the content or the static version, you should provide “link to this page,” “share this page,” or “email this page” functionality, and those links should always point to the static URL.
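
In other words, no matter which version the visitor is looking at, the sharing and email links hang off the static URL, along these lines (again, hypothetical URLs):

    <!-- shown on both the Ajax and static versions, but always pointing at the static URL -->
    <a href="http://www.example-travel-site.com/boston/restaurants/union-oyster-house/">Link to this page</a>
    <a href="mailto:?subject=Union%20Oyster%20House&amp;body=http://www.example-travel-site.com/boston/restaurants/union-oyster-house/">Email this page</a>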

When users request those pages or come from a search engine asking for the static URL, you need to make a decision about how to serve that content. If the user agent is capable of working with Ajax/JavaScript, feel free to serve it that way. If it’s a bot or a non-compatible user agent (i.e. a tablet, iPad, or mobile phone), then serve the HTML version. Lastly, I would always fail gracefully with a noscript tag containing a link that, when clicked, ensures the user gets the content they really want.
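
Here’s a rough sketch of that decision, assuming a Node/Express style server and a deliberately naive user agent check; the bot list, the render helpers, and the ?static=1 fallback parameter are all hypothetical placeholders, not a definitive implementation:

    // Naive sketch: serve plain HTML to bots and to anyone who asks for it explicitly,
    // and serve the Ajax version to everyone else.
    var express = require('express');
    var app = express();

    var BOT_PATTERN = /googlebot|bingbot|slurp|facebookexternalhit/i; // hypothetical, extend as needed

    function renderStaticHtml(venue) {
      // Plain HTML any spider (or browser) can read.
      return '<html><body><h1>' + venue + '</h1><p>Full venue details go here.</p></body></html>';
    }

    function renderAjaxShell(venue) {
      // Ajax version for capable browsers, failing gracefully with a noscript link
      // that forces the HTML version for users without JavaScript.
      return '<html><body><div id="app" data-venue="' + venue + '"></div>' +
             '<noscript><a href="/boston/restaurants/' + venue + '/?static=1">' +
             'View this page without JavaScript</a></noscript></body></html>';
    }

    app.get('/boston/restaurants/:venue', function (req, res) {
      var ua = req.headers['user-agent'] || '';
      if (BOT_PATTERN.test(ua) || req.query.static) {
        res.send(renderStaticHtml(req.params.venue)); // bots and non-JS user agents get static HTML
      } else {
        res.send(renderAjaxShell(req.params.venue));  // everyone else gets the Ajax version
      }
    });

    app.listen(3000);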

While this may seem like a bit of double work, if you use Ajax properly, it’s probably not. You pull the same information from the same database; it’s only the method of rendering that changes. Flash, on the other hand, is a bit more problematic and would probably require a fair amount of double work, so it’s not a method I recommend. One of the primary reasons it’s a good idea to pull the data from the same database is that it ensures you don’t create a “bad cloaking” situation. Technically, cloaking is serving different content to the spiders than to the users. If the actual content is the same, and the delivery technology and implementation are the only difference, you have a low-risk, highly defensible position, especially if you use the canonical tag to nudge the spiders in the direction of the real URL.
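
Continuing the sketch above, the idea is that the crawlable HTML page and the Ajax JSON endpoint both sit on top of the same data access code, and only the last rendering step differs (the function and route names here are made up):

    // One shared data access function, two renderings of the same data.
    // getVenueFromDatabase stands in for a real database query.
    function getVenueFromDatabase(slug, callback) {
      callback(null, { slug: slug, name: slug, type: 'restaurant' });
    }

    // Crawlable HTML page for spiders and non-JavaScript user agents
    app.get('/venues/:venue', function (req, res) {
      getVenueFromDatabase(req.params.venue, function (err, venue) {
        res.send('<h1>' + venue.name + '</h1>');
      });
    });

    // JSON endpoint the Ajax version calls for exactly the same data
    app.get('/api/venues/:venue', function (req, res) {
      getVenueFromDatabase(req.params.venue, function (err, venue) {
        res.json(venue);
      });
    });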

Once you have the static URLs in place, you need to provide a method for the search engines to see and access that content. You can use HTML sitemaps and XML sitemaps, but ideally you should set up dedicated crawling paths. Unless your site is very small (less than a few hundred pages), I would suggest a limited test first: roll this out in phases on non-mission-critical sections of the site. Use text browsers, text viewers, and crawlers like Xenu Link Sleuth or Website Auditor to check what the spiders actually see. Lastly, I would suggest setting up a monitoring page for use with services like ChangeDetection and/or Google Alerts. It’s important that you know if something “breaks” or “jumps the rails” within 24 hours, not 30 days later when 70% of your content has dropped out of the index.
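
For the sitemap piece, a bare-bones XML sitemap that lists the static URLs is enough for the engines to discover them (URLs hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example-travel-site.com/boston/restaurants/union-oyster-house/</loc>
      </url>
      <url>
        <loc>http://www.example-travel-site.com/boston/hotels/</loc>
      </url>
    </urlset>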

The last issue you want to consider is internal duplicate content. It’s not entirely unlikely that the “Ajax crawling bot” will find its way to your Ajax pages, and you don’t want it to index the content in that format. Using a rel=”canonical” tag that points to the static non-Ajax URL will help, but I’d also suggest adding the noindex, follow meta tag to the Ajax pages, just to be safe. Leaving things open for the search engines to decide is where problems come from … sometimes BIG and EXPENSIVE problems.
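
If a distinct Ajax URL does get crawled, its head can wave the spiders back toward the static version with something along these lines (URL hypothetical):

    <!-- on the Ajax version of the page -->
    <meta name="robots" content="noindex, follow" />
    <link rel="canonical" href="http://www.example-travel-site.com/boston/restaurants/union-oyster-house/" />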

So what are the takeaways from this post?

  • Ajax isn’t evil, but the implementation is going to be more difficult and complex, so be smart about how you do it
  • Provide distinct, static, unique URLs that are accessible from the Ajax pages
  • Use user agent detection to serve the best version OF THE SAME content
  • Use spider simulators to ensure you are serving the right version
  • Use change detection and monitoring to detect problems with indexing quickly and correct them before your website falls off the map.

photo credit: Shutterstock/Serg Zastavkin
