In message <[EMAIL PROTECTED]>, Steve Delaney <[EMAIL PROTECTED]> writes
>I need to maintain copies of some 2000+ pages on the server for SOE
>purposes. The application I inherited supposedly did that at one time,
>but was relocated to another server. Now of course all pages are
>generated and sent to the local browser - how do I keep a copy for the
>server? -sd
Assuming that you mean SEO purposes, you seem to be misunderstanding something. In the last century <G>, search engines could not index dynamically generated pages, but since most sites are dynamic nowadays, the search engines would be pretty empty if dynamic sites were excluded.

For example, go to Google and type site:www.hotcosta.com (one of my sites) into the search box. I have 10,000 pages listed, and the site is 100% generated. Try that with your own site and see what you get.

As you say, when someone visits your site, a copy of the page is generated and sent to the browser. But when a search engine robot visits your page, exactly the same thing happens. If all of your site is reachable by regular links - not JavaScript, Flash, or dropdowns - then you won't have a problem. And if parts of your site are not reachable by regular links, make a sitemap and cover it that way (see the P.S. below).
--
Pete Clark
Sunny Andalucia
http://hotcosta.com/Andalucia.Spain
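
P.S. If you do go the sitemap route, it doesn't need to be fancy: a sitemaps.org-style sitemap is just a list of URLs wrapped in a little XML, and you can generate it with a few lines of script. Here is a rough sketch in Python - the URLs and filename are placeholders, and in practice you would pull the full list of pages out of whatever database drives your site:

    from xml.sax.saxutils import escape

    # Hypothetical list of page URLs - replace with a query against your own data.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/products/widgets",
        "http://www.example.com/contact",
    ]

    # Write a minimal sitemaps.org (0.9) sitemap: one <url>/<loc> entry per page.
    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url>\n")
            f.write("    <loc>%s</loc>\n" % escape(url))
            f.write("  </url>\n")
        f.write("</urlset>\n")

Drop the resulting sitemap.xml in your document root and submit it to the search engines, and the pages that aren't reachable by plain links can still be crawled.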