>>Err, sorry Fuad, but you are wrong. Even Google disagrees with you /grin! - they happily crawl my company's website and return fully fleshed out pages (from their cache!) from our Ajax-based website. However, they don't seem to be willing to share their secrets /sigh.
- I wrote about "Search Engine Friendliness". I meant: different content for different agent signatures. AJAX for IE 8, plain HTML for Googlebot, very specific HTML for people with disabilities, and specific HTML for those who hate JavaScript. Yes, my laptop can generate HTML with colors, images, etc. from multiple chunks of data, but that's a different story... Googlebot would have to spend 10,000 times more CPU cycles than it does right now to do the same. A dual core plus 1-2-5 seconds to render a page (HtmlUnit, Mozilla, whatever) is OK for a single page, but it is not OK for Googlebot, which has a few milliseconds per page and billions of pages per day.

Your website probably uses extremely basic "deterministic" AJAX: dynamically loaded HTML snippets, where each snippet can have a static (search-engine-friendly) URL and contains embedded HTML tags. What about a REAL AJAX application? What about Adobe ActionScript (with its specific Search Engine API? It failed...)? Are you sure about "fully fleshed out pages"? What if they were JSON objects converted to DOM by some very specific "transformation"? Yes, Google can "emulate" the initial screen with CSS etc. and basic "OnLoad"-generated stuff, but it requires so much CPU that it can only do it for the home pages of the most important sites.

And JSON... the subject of this discussion is similar to "form submission": can Googlebot discover ALL imaginable URLs generated by form submissions (including JavaScript-generated URLs), even if you don't publish such URLs explicitly as part of an SEO strategy? And even if it can, it won't play any role: zero rank, since there are no incoming links...
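The "different content for different agent signatures" idea boils down to a User-Agent dispatch on the server. A minimal sketch, assuming a hypothetical site; the variant names and matching rules here are mine, purely for illustration:

```python
# Hypothetical per-agent-signature dispatch, as described above.
# Variant names and User-Agent patterns are illustrative assumptions.

def pick_variant(user_agent: str) -> str:
    """Return which flavor of the page to serve for a given User-Agent."""
    ua = user_agent.lower()
    if "googlebot" in ua:
        return "static-html"   # pre-rendered, crawlable HTML for the bot
    if "msie 8" in ua:
        return "ajax-app"      # full AJAX application for IE 8
    return "plain-html"        # JavaScript-free fallback for everyone else

print(pick_variant("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # → static-html
print(pick_variant("Mozilla/4.0 (compatible; MSIE 8.0)"))       # → ajax-app
```

Whether serving the crawler different bytes than the browser counts as legitimate accessibility or as cloaking is exactly the contentious part.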

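The form-submission point can be made concrete with a toy count (my numbers, not Google's): the space of submittable URLs grows multiplicatively with the form's fields, so exhaustive discovery is hopeless even before ranking enters the picture.

```python
# Toy illustration of why crawling ALL imaginable form-submission URLs
# is infeasible: the URL space is the product of each field's choices.
# The form fields and counts below are invented for the example.
from math import prod

field_choices = {
    "category": 50,     # dropdown with 50 entries
    "sort": 4,          # four sort orders
    "page": 1000,       # up to 1000 result pages
    "query": 10**6,     # a million plausible query strings
}

total_urls = prod(field_choices.values())
print(total_urls)  # 50 * 4 * 1000 * 1_000_000 = 200_000_000_000
```

And as the comment says: even if a crawler enumerated them, pages with no incoming links would rank at zero anyway.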