Re: HAproxy Tuning - HTTP requests loading serially

2011-02-04 Thread Willy Tarreau
Hi David,

On Wed, Feb 02, 2011 at 05:42:20PM -0800, David Tosoff wrote:
> Thanks Peres,
> 
> I had tried that already. http-server-close actually took my sites and games
> down to a crawl.
> 
> I tried pretend-keepalive as well, and removing httpclose altogether; same
> result; it seems best with httpclose enabled.
> 
> Any other ideas/suggestions?

Before 1.4.9, combining http-server-close with pretend-keepalive could
result in what you observed, because the server sometimes did not send a
close in the response, and the client then believed the connection would
stay alive. But some servers also failed to send a content-length or
transfer-encoding if they got a close, resulting in the stupid situation
where the client has to wait for the server to actively close the
connection to detect the end of the response. 1.4.9 fixed that but
introduced a similar issue when combining httpclose with
pretend-keepalive. All of those are fixed in the latest -git, which will
soon be released as 1.4.11.

Anyway, 1.4.10 with http-server-close + pretend-keepalive SHOULD be OK.
Please tell us if you still get the issue with that version.
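
For example, a minimal defaults section for that combination could look
like this (the option keywords are the real 1.4 ones; the timeout values
are only examples, keep your own):

    defaults
        mode http
        # close the server side after each response, but let the
        # client side stay alive
        option http-server-close
        # announce keep-alive to the server so it emits a proper
        # content-length or transfer-encoding
        option http-pretend-keepalive
        option forwardfor
        timeout connect 5s
        timeout client  30s
        timeout server  30s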

> On another note, I notice that haproxy strips out/uncompresses any
> gzipping the server replies with. Could this be related?

Yes, though it's not haproxy which strips it: there was a bug in at
least one server (Tomcat) which decided that an HTTP/1.1 request with
"Connection: close" was equivalent to an HTTP/1.0 request (which it is
not). Because of this, it refrained from using transfer encoding, which
is needed to send compressed contents. This bug was reported to the
Tomcat team and, if I understood right, fixed in the latest version.
This is what led us to implement the "pretend-keepalive" option. It's
possible that other servers have the same bug, though I don't have a
list.
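
To illustrate (with simplified, made-up headers), the same request with
and without "Connection: close" can get two very different responses
from such a server. With keep-alive it can compress and chunk:

    GET /img/wall.jpg HTTP/1.1
    Accept-Encoding: gzip

    HTTP/1.1 200 OK
    Content-Encoding: gzip
    Transfer-Encoding: chunked

With "Connection: close" the buggy server falls back to HTTP/1.0
behaviour:

    GET /img/wall.jpg HTTP/1.1
    Accept-Encoding: gzip
    Connection: close

    HTTP/1.1 200 OK
    Connection: close
    (no transfer-encoding, no content-encoding: the body is sent
    uncompressed and its end is only signalled by the close)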

Cheers,
Willy




Re: HAproxy Tuning - HTTP requests loading serially

2011-02-02 Thread David Tosoff
Thanks Peres,

I had tried that already. http-server-close actually took my sites and games
down to a crawl.

I tried pretend-keepalive as well, and removing httpclose altogether; same
result; it seems best with httpclose enabled.

Any other ideas/suggestions?

On another note, I notice that haproxy strips out/uncompresses any gzipping
the server replies with. Could this be related?

Cheers,

David




From: Peres Levente 
To: David Tosoff 
Sent: Wed, February 2, 2011 3:00:16 AM
Subject: Re: HAproxy Tuning - HTTP requests loading serially

Hi David,

Just a blind guess, but...

I don't know how your app handles keepalive, but maybe try losing "option
httpclose" and use "option http-server-close" instead, thus allowing proper
keepalive. This gave me an immediate performance boost for a webfarm that
serves literally millions of requests per minute, albeit at the cost of
having to deal with about a hundred thousand improperly-closed stale
connections simultaneously on the gateways, which can be taken care of with
timeout and sysctl settings.
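
For example (illustrative values only, every site needs its own):

    # haproxy: reap idle client connections quickly
    timeout http-keep-alive 10s
    timeout client          30s

    # kernel (sysctl): let half-closed sockets go away faster
    net.ipv4.tcp_fin_timeout = 15
    net.ipv4.tcp_max_orphans = 262144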

Levente





HAproxy Tuning - HTTP requests loading serially

2011-02-01 Thread David Tosoff
Hello All,

I've recently set up a pair of haproxy VMs (running in Hyper-V) over Ubuntu
10.10 w/ keepalived.

Things seem to be working pretty well. We're moving 35-50Mbps (1442 concurrent 
sessions avg) thru the primary node all day, but we're noticing that multiple 
concurrent http requests from the client seem like they're being responded to 
serially.

For example, we run a 3D game that issues http requests for in-world resources 
(textures, maps, images) from the client to the web servers through HAproxy. 
When we log into the game, we see multiple blank areas on the walls that load 
one-by-one, slowly, serially. When we bypass HAproxy, everything will load 
immediately. Oddly enough, individual requests thru haproxy are very fast: a
65K resource file downloads in 0.17 seconds; but the next resource doesn't
load until the previous one is complete...

Is there a limit on how many concurrent (http or otherwise) connections a
client can have to haproxy/linux?
Can you point me to any performance tweaks I can apply in Ubuntu or Haproxy
that will help with this?

Thanks in advance!
David

[CURRENT CONFIG]
global
    daemon
    user haproxy
    maxconn 10
    pidfile /etc/haproxy/haproxy.pid
    stats socket /tmp/haproxy.stat level admin

defaults
    mode http
    timeout connect 5000ms
    timeout client 5ms
    timeout server 5ms
    retries 3
    option redispatch
    option httpclose
    option forwardfor

backend WEBFARM
    balance leastconn
    cookie HAP-SID insert indirect maxidle 120m
    option httpchk GET /check.aspx HTTP/1.0
    http-check expect string SUCCESS
    server TC-IIS-2 10.4.1.22:80 cookie TI2 check
    server TC-IIS-3 10.4.1.23:80 cookie TI3 check
    server TC-IIS-4 10.4.1.24:80 cookie TI4 check
    server TC-IIS-5 10.4.1.25:80 cookie TI5 check
    server TC-IIS-6 10.4.1.26:80 cookie TI6 check
    server TC-IIS-7 10.4.1.27:80 cookie TI7 check

frontend HYBRID_WEBS
    default_backend WEBFARM
    bind 127.0.0.1:80 name LOCAL
    bind 10.4.0.10:80 name HYBRID_WEBS