I know a robots.txt in the root dir will stop spiders from doing this - not 
sure if proxies are well-behaved enough to respect it, but it might be worth a 
try.
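
Something like this in the document root might do it (the /order/ path is just 
a placeholder for wherever your form-handling JSPs live):

    # robots.txt - keep well-behaved crawlers out of the form flow
    User-agent: *
    Disallow: /order/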

cheers
dim

On Fri, 29 Jun 2001 04:44, you wrote:
> Hi All,
>
> Every day I get hundreds of hits on my JSP pages from proxy servers that are
> trying to determine if the links are still good. This is great for my
> static pages, but on pages with forms and processing logic it causes havoc.
> For example, if I have 3 pages of forms and the final page adds something
> to my database, hitting just the second page throws errors.
>
> I know that there is a pragma directive I need to add to each page, but
> isn't there also something that can be added to the HTTP header each time.
> And if so, what is the easiest way to add this to every outgoing header?
>
> Thanks,
>
> David M Rosner
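
As for adding the header to every page: I haven't tried it myself, but if your 
container supports the Servlet 2.3 filter API, a sketch along these lines 
should set the cache headers on every response without editing each JSP (the 
class name is just a placeholder):

    import java.io.IOException;
    import javax.servlet.*;
    import javax.servlet.http.HttpServletResponse;

    // Adds no-cache headers to every response that passes through the filter.
    public class NoCacheFilter implements Filter {

        public void init(FilterConfig config) throws ServletException {}

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            if (res instanceof HttpServletResponse) {
                HttpServletResponse httpRes = (HttpServletResponse) res;
                httpRes.setHeader("Cache-Control", "no-cache, no-store");
                httpRes.setHeader("Pragma", "no-cache");      // for HTTP/1.0 proxies
                httpRes.setDateHeader("Expires", 0);
            }
            chain.doFilter(req, res);
        }

        public void destroy() {}
    }

Map it to *.jsp (or just the form pages) in web.xml and every outgoing 
response should pick up the headers.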
