On 12/19/06, John Dowdell <[EMAIL PROTECTED]> wrote:
> hank williams wrote:
> > I want to apologize for the stridency of my earlier remarks.
>
> No worries, you've contributed a lot over the years, forget about it. :)
> I'm not sure how the various search engines do with query terms in URLs,
> though... some may index and/or rank on it, others may not. I don't know.
I may not have been clear. The info in the URL is just the same kind of info that a regular website would use to indicate what to display. So it's not that Google is indexing the URL. It's that the server, when it sees an extended URL, sends indexable HTML along with the rest of the webpage that contains the SWF. In this way, URLs are used to identify a particular set of content, just like a non-Flash website.

> I do know that the various search engines explicitly warn against
> "cloaking" (redirecting search results to different content), but for
> understandable reasons they don't go into much detail on their
> implementations.
I am not suggesting this. I am just suggesting that when a URL describing a particular content set is sent to the server, for example a particular MySpace page like www.myspace.com/hank, the server sends along invisible HTML data that exactly matches what the SWF is going to show once it gets up and running.

> It sounds like our core current problem is in figuring out how to get
> search engine results for dynamic user-generated content piped through
> a standard interface...
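To make the idea above concrete, here is a minimal sketch of that server-side approach. Everything here is hypothetical (the `PROFILES` data, the `render_page` helper, the `site.swf` filename): one URL identifies one content set, and the server answers with a single page containing both the SWF embed and the same content as plain, indexable HTML.

```python
# Hypothetical sketch: one content-identifying URL (e.g. /hank) yields one
# page containing both the SWF embed and matching indexable HTML.

# Stand-in for the database of user-generated content.
PROFILES = {
    "hank": {"title": "Hank's page", "bio": "Musician and developer."},
}

def render_page(path: str) -> str:
    """Return the HTML for a content URL: SWF embed plus matching HTML."""
    name = path.strip("/")
    profile = PROFILES.get(name)
    if profile is None:
        return "<html><body>Not found</body></html>"
    # The SWF receives the same identifier via FlashVars, so what it
    # renders comes from the same record as the HTML block below.
    return f"""<html><body>
<object data="site.swf" type="application/x-shockwave-flash">
  <param name="FlashVars" value="user={name}">
</object>
<!-- Same content, visible to search-engine spiders -->
<div id="indexable">
  <h1>{profile["title"]}</h1>
  <p>{profile["bio"]}</p>
</div>
</body></html>"""
```

Because both views are generated from the same record, the hidden HTML cannot drift out of sync with what the SWF shows, which is what keeps this on the right side of the cloaking rules mentioned above.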
Well, piped into HTML when an appropriate URL is delivered to a server.

> if someone types "salmonella" into a restaurant review, e.g. I'm still
> not sure whether the search engines will support such a goal or not...
They will if we can get the data into HTML.

> the big thing this year for Google, Yahoo and MSN was to support a
> standard "sitemap" protocol, but this is again for the main static
> content, rather than the ongoing contributions held within a database:
> http://www.sitemaps.org/
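For reference, a minimal sitemap file under the protocol linked above looks like this (the myspace URL is just the example used in this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.myspace.com/hank</loc>
  </url>
</urlset>
```

Each `<loc>` entry is a URL the spider should fetch; nothing in the protocol says what the server must return when it does.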
The sitemap protocol solves an important half of the problem: it identifies the URLs that the spider should search. But then you need the server to actually send indexable data/HTML when those URLs are hit. And what I am saying is that when one of these URLs is hit, the server needs to provide both the SWF and the invisible HTML containing the indexable data that you will see in the SWF. This means the content will be accessible in the SWF, and accessible to the search engines in HTML. This is the thrust of what I described earlier.

Regards,
Hank