Hi Thrawn,
I tried these configs, and there doesn't seem to be much if any
difference. The TCP one might even be the slowest in my limited
virtualized tests, but only by a few milliseconds.
frontend lua-replyip
    bind 192.168.0.120:9010
    mode http
    http-request use-service lua.lua-replyip

frontend lua-replyip-copy
    bind 192.168.0.120:9011
    mode tcp
    tcp-request content use-service lua.lua-replyip-tcp

frontend lua-replyip-httpreq
    bind 192.168.0.120:9012
    mode http
    http-request lua.lua-replyip-http-req
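(For reference, each of the three frontends can be exercised against the bind addresses above; each should reply with the client's source IP as the response body. The exact commands below are an illustration, not part of the original test:)

```
curl http://192.168.0.120:9010/
curl http://192.168.0.120:9011/
curl http://192.168.0.120:9012/
```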
core.register_service("lua-replyip", "http", function(applet)
    local response = applet.f:src()
    applet:set_status(200)
    applet:add_header("Server", "haproxy-lua/echo")
    applet:add_header("content-length", string.len(response))
    applet:add_header("content-type", "text/plain")
    applet:start_response()
    applet:send(response)
end)
core.register_service("lua-replyip-tcp", "tcp", function(applet)
    local buffer = applet.f:src()
    applet:send("HTTP/1.0 200 OK\r\nServer: haproxy-lua/echo\r\n"
        .. "Content-Type: text/html\r\nContent-Length: " .. buffer:len()
        .. "\r\nConnection: close\r\n\r\n" .. buffer)
end)
core.register_action("lua-replyip-http-req", { "http-req" }, function (txn)
    local buffer = txn.f:src()
    txn.res:send("HTTP/1.0 200 OK\r\nServer: haproxy-lua/echo\r\n"
        .. "Content-Type: text/html\r\nContent-Length: " .. buffer:len()
        .. "\r\nConnection: close\r\n\r\n" .. buffer)
    txn:done()
end)
On 11-11-2015 at 3:07, Thrawn wrote:
Hmm...I seem to be able to set up something in TCP mode, and it
returns the expected response via curl, but its performance is awful.
I must be doing something wrong.
Lua:
core.register_action("tcp-echo", {"tcp-req"}, function (txn)
    local buffer = txn.f:src()
    txn.res:send("HTTP/1.0 200 OK\r\nServer: haproxy-lua/echo\r\n"
        .. "Content-Type: text/html\r\nContent-Length: " .. buffer:len()
        .. "\r\nConnection: close\r\n\r\n" ..
(is the appending of 'buffer' missing at the end of the line above?)
    txn:done()
end)
I couldn't find a way for a TCP applet to retrieve the client IP
address; suggestions are welcome.
HAProxy config:
frontend tcp-echo
    bind 127.0.2.1:1610
    timeout client 10000
    mode tcp
    tcp-request content lua.tcp-echo
Testing this with ab frequently hangs and times out even at tiny loads
(10 requests with concurrency 3).
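(The exact ab invocation isn't quoted anywhere in the thread; a run matching the described load, with the URL taken from the bind line above, would look something like this:)

```
ab -n 10 -c 3 http://127.0.2.1:1610/
```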
On Wednesday, 11 November 2015, 10:19, PiBa-NL <piba.nl....@gmail.com>
wrote:
By the way, if the sole purpose of the frontend is to echo the IP back
to the client, you should probably also check the 'use-service' applet
syntax; I don't know whether that would be faster for your purpose.
Another thing to check is whether you want to use the TCP or HTTP
service mode. A TCP service could be almost one line of Lua code, and
I'd expect it to be a bit faster.
http://www.arpalert.org/src/haproxy-lua-api/1.6/index.html#haproxy-lua-hello-world
Instead of sending 'hello world' you could send the client-ip..
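(A sketch of what that could look like; the service name here is made up, and it assumes `applet.f:src()` works on a TCP applet as in the other examples in this thread:)

```
core.register_service("replyip-min", "tcp", function(applet)
    -- send the client's source address back instead of "hello world"
    applet:send(applet.f:src())
end)
```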
On 10-11-2015 at 23:46, Thrawn wrote:
OK, some explanation seems in order :).
I ran ab with concurrency 1000 and a total of 30000 requests, against
each server, 5 times, plus one run each with 150000 requests (sum of
the previous 5 tests).
For Apache+PHP, this typically resulted in 5-15ms response time for
99% of requests, with the remaining few either taking tens of seconds
or eventually disconnecting with an error.
For HAProxy+Lua, 99% response times were 1ms, or sometimes 2ms, with
the last few taking about 200ms. So, HAProxy worked much better (of
course).
However, on the larger run (150k), HAProxy too had a small percentage
of disconnections (apr_socket_recv: Connection reset by peer). I've
been able to reproduce this with moderate consistency whenever I push
it beyond about 35000 total requests. It's still a better error rate
than PHP, but I'd like to understand why the errors are occurring.
For all I know, it's a problem with ab.
I've also tried a couple of runs with 150000 requests but concurrency
only 100, and neither server had trouble serving that, although
interestingly, PHP is slightly more consistent: 99% within 4-5ms,
then about 200ms for the last few, whereas HAProxy returns 99% within
1-2ms and 1800ms for the last few.
The box is just my workstation, 8 cores and 16GB RAM, running Ubuntu
15.10, with no special tuning.
Any ideas on why the HAProxy tests showed disconnections or
occasional slow response times at high loads?
On Wednesday, 11 November 2015, 8:29, Baptiste <bed...@gmail.com> wrote:
On Tue, Nov 10, 2015 at 10:46 PM, Thrawn
<shell_layer-git...@yahoo.com.au> wrote:
> OK, I've set this up locally, and tested it against PHP using ab.
>
> HAProxy was consistently faster (99% within 1ms, vs 5-15ms for PHP),
> but at request volumes over about 35000, with concurrency 1000, it
> consistently had a small percentage of socket disconnections. PHP had
> timeouts - or very long response times - and disconnections at pretty
> much any request volume with that concurrency, but I'm wondering where
> the errors stem from, or even if it's a limitation of ab.
>
> HAProxy config:
>
> global
>     maxconn 4096
>     daemon
>     nbproc 1
>     stats socket localhost:9461 level admin
>     chroot /etc/haproxy/jail
>     user haproxy
>     group haproxy
>     lua-load /etc/haproxy/jail/echo.lua
>
> defaults
>     log 127.0.0.1 local0
>     mode http
>     timeout client 60000
>     timeout server 60000
>     timeout connect 60000
>     option forwardfor
>     balance roundrobin
>     option abortonclose
>     maxconn 20
>
> frontend echo
>     bind 127.0.1.1:1610
>     timeout client 10000
>     mode http
>     http-request lua.echo
>
> Lua:
>
> core.register_action("echo", { "http-req" }, function (txn)
>     local buffer = txn.f:src()
>     txn.res:send("HTTP/1.0 200 OK\r\nServer: haproxy-lua/echo\r\n"
>         .. "Content-Type: text/html\r\nContent-Length: " .. buffer:len()
>         .. "\r\nConnection: close\r\n\r\n" .. buffer)
>     txn:done()
> end)
>
Hi Thrawn,
I'm sorry, but I can't make sense of your benchmarks! If you could at
least give an explanation before running each ab, that would help.
Furthermore, you don't share anything about your hardware environment,
nor the tuning you did on each box, so it's impossible to help you.
At least I can say that Lua seems to perform very well :)
Baptiste