Hi John,

On Nov 15, 2009, at 8:29 PM, John Lauro wrote:

> I would probably do that sort of throttling at the OS level with
> iptables, etc...

Hmmm, how? I don't want to throw away the requests, just queue them.
Looking at iptables rate limiting, it seems that you can only drop the
request.
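(For illustration only, not from the thread: a typical netfilter rate
limit looks like the sketch below, with made-up port and rate values.
Traffic over the limit can only be ACCEPTed, DROPped or REJECTed; there
is no queueing target, which is exactly the limitation being discussed.)

```
# Illustrative sketch, invented values: allow up to 10 new HTTP
# connections per second with a burst of 20 ...
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m limit --limit 10/second --limit-burst 20 -j ACCEPT
# ... and everything above that rate is simply dropped, not queued.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -j DROP
```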
On Nov 16, 2009, at 1:47 PM, Karsten Elfenbein wrote:

Just create an additional backend and assign the bots to it.
You can set queues and max connections there as needed.
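A minimal sketch of that idea (the backend names, server address,
User-Agent strings and limits below are invented for illustration, and
the syntax assumes an HAProxy of that era): bots are matched on their
User-Agent and sent to a backend where only a couple of connections run
concurrently, while the excess waits in HAProxy's queue instead of
hitting the wiki.

```
frontend www
    bind :80
    # hypothetical bot detection; extend the list as needed
    acl is_bot hdr_sub(User-Agent) -i googlebot msnbot slurp
    use_backend bots if is_bot
    default_backend wiki

backend wiki
    server wiki1 127.0.0.1:8080 maxconn 50

backend bots
    # at most 2 concurrent bot requests against the same server;
    # further bot requests queue for up to 30 seconds
    timeout queue 30s
    server wiki1-bots 127.0.0.1:8080 maxconn 2
```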
An additional tip: adjust the robots.txt file, as some bots can be
slowed down that way:
http://www.google.com/support/webmasters/bin/answer.py?answer=48620
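(A sketch of what that robots.txt adjustment might look like; the delay
value is invented. Note that Googlebot ignores Crawl-delay — its rate is
set through the Webmaster Tools page linked above — but several other
well-behaved crawlers honor it.)

```
# Hypothetical robots.txt fragment: ask well-behaved crawlers to wait
# between requests. Googlebot ignores this directive.
User-agent: *
Crawl-delay: 10
```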
-----Original Message-----
From: Wout Mertens [mailto:wout.mert...@gmail.com]
Sent: November 16, 2009 9:19 AM
To: John Lauro
Cc: haproxy@formilux.org
Subject: Re: Preventing bots from starving other users?

On Nov 16, 2009, at 2:43 PM, John Lauro wrote:

> Oopps, my bad... It's
Perhaps this plugin could be useful; I've never used it, though:
http://twiki.org/cgi-bin/view/Plugins.TWikiCacheAddOn
Hi there,

I was wondering if HAProxy helps in the following situation:

- We have a wiki site which is quite slow
- Regular users don't have many problems
- We also get crawled by a search bot, which creates many concurrent
  connections, more than the hardware can handle
- Therefore, service is

32MB of RAM, the hardware should be able to handle that...
-----Original Message-----
From: Wout Mertens [mailto:wout.mert...@gmail.com]
Sent: Sunday, November 15, 2009 9:57 AM
To: haproxy@formilux.org
Subject: Preventing bots from starving other users?