Hi,
On Mon, Nov 16, 2009 at 04:33:34PM +0100, Wout Mertens wrote:
> Schweet! I'll give that a shot.
If you want to experiment a bit, with version 1.4 (development),
you can even add a delay to all the requests from this bot. The
idea is to identify the bot with an ACL and tell the TCP layer
to w
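A sketch of how that 1.4-style delay might look (the ACL, source range, backend name, and timing below are illustrative assumptions, not from the original mail):

    frontend www
        bind :80
        # "bot" identifies the crawler somehow, e.g. by source network (assumed range)
        acl bot src 66.249.64.0/19
        # hold matching connections in the inspect delay; WAIT_END is a predefined ACL
        tcp-request inspect-delay 10s
        tcp-request content accept if !bot
        tcp-request content accept if WAIT_END
        default_backend wiki

Non-bot traffic is accepted immediately; bot connections sit out the full 10 seconds before being processed.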
Schweet! I'll give that a shot.
Wout.
On Nov 16, 2009, at 4:08 PM, Karsten Elfenbein wrote:
> you can just create the backend in haproxy and use the same backend server
> definition
> no need to reconfigure apache
>
> put like 7 max sessions for normal users on one backend and 2 for maxsession
you can just create the backend in haproxy and use the same backend server
definition
no need to reconfigure apache
put like 7 max sessions for normal users on one backend and 2 max sessions
on the bot backend
throw in some queues and you are set
Karsten
On Monday, 16 November 2009, wrote
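Karsten's two-backend idea might translate to something like this (the server address, names, and limits are assumptions; note both backends point at the same apache):

    frontend www
        bind :80
        # crude bot detection by User-Agent substring (assumed pattern)
        acl is_bot hdr_sub(User-Agent) -i bot
        use_backend bots if is_bot
        default_backend users

    backend users
        # same physical apache server in both backends
        server wiki1 192.168.1.10:80 maxconn 7

    backend bots
        # bots get fewer slots; excess requests wait in haproxy's queue
        server wiki1 192.168.1.10:80 maxconn 2

With maxconn set, requests beyond the limit queue inside haproxy instead of piling up on apache, so no apache reconfiguration is needed.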
Perhaps this plugin could be useful; I've never used it, though:
http://twiki.org/cgi-bin/view/Plugins.TWikiCacheAddOn
On Mon, Nov 16, 2009 at 11:46 AM, Wout Mertens wrote:
> On Nov 16, 2009, at 1:47 PM, Karsten Elfenbein wrote:
>
> > Just create an additional backend and assign the bots to it.
> > You can
On Nov 16, 2009, at 1:47 PM, Karsten Elfenbein wrote:
> Just create an additional backend and assign the bots to it.
> You can set queues and max connections there as needed.
Yes, you're right - that's probably the best solution. I'll create an extra
apache process on the same server that will h
s and limit on the number
of connections / sec based on ip addresses...
> -Original Message-
> From: Wout Mertens [mailto:wout.mert...@gmail.com]
> Sent: Monday, November 16, 2009 9:19 AM
> To: John Lauro
> Cc: haproxy@formilux.org
> Subject: Re: Preventing bots from starvi
you at all.
-JohnF
> -Original Message-
> From: Wout Mertens [mailto:wout.mert...@gmail.com]
> Sent: November 16, 2009 9:19 AM
> To: John Lauro
> Cc: haproxy@formilux.org
> Subject: Re: Preventing bots from starving other users?
>
> On Nov 16, 2009, at 2:4
On Nov 16, 2009, at 2:43 PM, John Lauro wrote:
> Oops, my bad... It's actually tc and not iptables. Google "tc qdisc"
> for some info.
>
> You could allow your local ips go unrestricted, and throttle all other IPs
> to 512kb/sec for example.
Hmmm... The problem isn't the data rate, it's the
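For completeness, the tc approach John describes might look roughly like this (the interface, subnet, and rates are assumptions; this shapes outbound traffic only):

    # Default all traffic into the 512 kbit/s class...
    tc qdisc add dev eth0 root handle 1: htb default 20
    tc class add dev eth0 parent 1: classid 1:10 htb rate 100mbit
    tc class add dev eth0 parent 1: classid 1:20 htb rate 512kbit
    # ...but let the local subnet through unthrottled.
    tc filter add dev eth0 parent 1: protocol ip u32 \
        match ip dst 10.0.0.0/24 flowid 1:10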
> To: John Lauro
> Cc: haproxy@formilux.org
> Subject: Re: Preventing bots from starving other users?
>
> Hi John,
>
> On Nov 15, 2009, at 8:29 PM, John Lauro wrote:
>
> > I would probably do that sort of throttling at the OS level with
> iptables,
> > etc...
>
> H
If the bot conforms why not just control its behavior by specifying
restrictions in your robots.txt?
http://www.robotstxt.org/
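For a compliant crawler, something like this in robots.txt may be enough (the paths and bot name are placeholders; note that Googlebot ignores Crawl-delay and uses its own crawl-rate setting instead):

    User-agent: *
    Crawl-delay: 10

    User-agent: SomeAggressiveBot
    Disallow: /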
On Sun, Nov 15, 2009 at 9:57 AM, Wout Mertens wrote:
> Hi there,
>
> I was wondering if HAProxy helps in the following situation:
>
> - We have a wiki site which is quite
Just create an additional backend and assign the bots to it.
You can set queues and max connections there as needed.
Also an additional tip might be to adjust the robots.txt file as some bots can
be slowed down.
http://www.google.com/support/webmasters/bin/answer.py?answer=48620
Check if the bots
Hi John,
On Nov 15, 2009, at 8:29 PM, John Lauro wrote:
> I would probably do that sort of throttling at the OS level with iptables,
> etc...
Hmmm... How? I don't want to throw away the requests, just queue them. Looking
at iptables rate limiting, it seems you can only drop requests.
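An example of the drop-only behaviour being described, using the connlimit match (the port and threshold are arbitrary): connections over the limit are rejected outright, never queued:

    # Reject new HTTP connections from any single source IP above 10 concurrent
    iptables -A INPUT -p tcp --syn --dport 80 \
        -m connlimit --connlimit-above 10 -j REJECT --reject-with tcp-reset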
2009/11/15 Wout Mertens :
> I was wondering if HAProxy helps in the following situation:
>
> - We have a wiki site which is quite slow
> - Regular users don't have many problems
> - We also get crawled by a search bot, which creates many concurrent
> connections, more than the hardware can handle
On Sun 15.11.2009 15:57, Wout Mertens wrote:
Hi there,
I was wondering if HAProxy helps in the following situation:
- We have a wiki site which is quite slow
- Regular users don't have many problems
- We also get crawled by a search bot, which creates many concurrent
connections, more than the
I would probably do that sort of throttling at the OS level with iptables,
etc...
That said, before that I would investigate why the wiki is so slow...
Something probably isn't configured right if it chokes with only a few
simultaneous accesses. I mean, unless it's an embedded server with under 32MB