Hello All,
I've recently set up a pair of haproxy VMs (running in Hyper-V) on Ubuntu 10.10
with keepalived.
Things seem to be working pretty well. We're moving 35-50 Mbps (1442 concurrent
sessions on average) through the primary node all day, but we're noticing that
multiple concurrent HTTP requests from a client seem to be responded to
serially.
For example, we run a 3D game that issues HTTP requests for in-world resources
(textures, maps, images) from the client to the web servers through HAProxy.
When we log into the game, we see multiple blank areas on the walls that load
one-by-one, slowly, serially. When we bypass HAProxy, everything loads
immediately. Oddly enough, individual requests through haproxy are very fast: a
65K resource file downloads in 0.17 seconds; but the next resource doesn't load
until the previous one is complete...
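For reference, here's roughly how I've been timing it; a quick sketch, where the URL is a placeholder for one of our resource files rather than the real path:

```python
# Fetch the same resource N times in parallel and compare wall-clock time
# to the sum of per-request times. If the wall time is close to the sum,
# the requests are being serialized somewhere in the path.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def timed_fetch(url):
    """Fetch one URL and return the elapsed seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.monotonic() - start

def parallel_fetch(url, n=8):
    """Issue n concurrent requests; return (wall_time, per_request_times)."""
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=n) as pool:
        times = list(pool.map(timed_fetch, [url] * n))
    return time.monotonic() - start, times

if __name__ == "__main__":
    # Placeholder URL -- substitute a real resource behind the proxy.
    wall, times = parallel_fetch("http://10.4.0.10/resource.dat")
    print(f"wall: {wall:.2f}s, sum of individual: {sum(times):.2f}s")
```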
Is there a limit on how many concurrent connections (HTTP or otherwise) a
single client can have to haproxy/Linux?
Can you point me to any performance tweaks in Ubuntu or HAProxy that would
help with this?
Thanks in advance!
David
[CURRENT CONFIG]
global
    daemon
    user haproxy
    maxconn 100000
    pidfile /etc/haproxy/haproxy.pid
    stats socket /tmp/haproxy.stat level admin

defaults
    mode http
    timeout connect 5000ms
    timeout client 50000ms
    timeout server 50000ms
    retries 3
    option redispatch
    option httpclose
    option forwardfor

backend WEBFARM
    balance leastconn
    cookie HAP-SID insert indirect maxidle 120m
    option httpchk GET /check.aspx HTTP/1.0
    http-check expect string SUCCESS
    server TC-IIS-2 10.4.1.22:80 cookie TI2 check
    server TC-IIS-3 10.4.1.23:80 cookie TI3 check
    server TC-IIS-4 10.4.1.24:80 cookie TI4 check
    server TC-IIS-5 10.4.1.25:80 cookie TI5 check
    server TC-IIS-6 10.4.1.26:80 cookie TI6 check
    server TC-IIS-7 10.4.1.27:80 cookie TI7 check

frontend HYBRID_WEBS
    bind 127.0.0.1:80 name LOCAL
    bind 10.4.0.10:80 name HYBRID_WEBS
    default_backend WEBFARM
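One thing I'm planning to try, in case the forced per-request connection close is what serializes the client: swapping "option httpclose" for "option http-server-close" in the defaults section, so connections stay alive on the client side while the server side still closes after each request (this assumes the packaged haproxy is 1.4 or newer):

```
defaults
    mode http
    option http-server-close
    option forwardfor
    timeout connect 5000ms
    timeout client 50000ms
    timeout server 50000ms
```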