Shawn wrote:
Hi, I have been trying to figure out a way to limit the massive amount
of bandwidth that search bots (Googlebot/2.1) consume daily from my
website. My problem is that I am running Apache::ASP, and about 90% of
the site is dynamic content, with links such as product.htm?id=100.
Doesn't Apache::Throttle, configured in Apache to kick in as a fixup
handler, cover this?
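The fixup-handler idea can be sketched directly in mod_perl 1.x. This is a minimal illustration, not Apache::Throttle's actual interface; the module name `My::BotLimit`, the daytime window, and the 503 policy are all assumptions for the sketch:

```perl
package My::BotLimit;
# Hypothetical PerlFixupHandler: ask Googlebot to back off during the day.
# Enable in httpd.conf with:  PerlFixupHandler My::BotLimit
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r  = shift;
    my $ua = $r->header_in('User-Agent') || '';
    if ($ua =~ /Googlebot/i) {
        my $hour = (localtime)[2];
        if ($hour >= 8 && $hour < 20) {
            # 503 plus Retry-After tells a well-behaved bot to come back later
            $r->err_header_out('Retry-After' => 3600);
            return 503;
        }
    }
    return OK;    # everyone else proceeds normally
}
1;
```

Returning 503 rather than 403 matters here: Googlebot treats it as temporary and retries later, instead of dropping the pages from its index.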
On Wed, 2004-01-28 at 11:42, Shawn wrote:
> I guess the only true way to tell if this is working will
> be to check the access logs and what response codes were given back to
> googlebot over time.
You could just test it with LWP by setting your UserAgent to be whatever
Google uses.
- Perrin
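Perrin's suggestion can be sketched in a few lines of LWP; the target URL is a placeholder, and the User-Agent string is Googlebot's published value of the time:

```perl
#!/usr/bin/perl
# Fetch a page while claiming to be Googlebot, then check which status
# code the throttling setup hands back.
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->agent('Googlebot/2.1 (+http://www.googlebot.com/bot.html)');

# www.example.com stands in for the real site
my $res = $ua->get('http://www.example.com/product.htm?id=100');
printf "%s %s\n", $res->code, $res->message;
```

Run it once with the bot User-Agent and once with the default, and compare the two response codes instead of waiting for Googlebot's next visit to show up in the access logs.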
-----Original Message-----
From: Josh Chamas [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, January 27, 2004 7:41 PM
To: Shawn
Cc: [EMAIL PROTECTED]
Subject: Re: Search Bot