On Wed, 7 Mar 2007 04:14:57 -0700 "Kirk Haines" <[EMAIL PROTECTED]> wrote:
> > By the way, check the errors section of the httperf report, and the
> > production.log.  See if there are "fd_unavailable" socket errors in
> > the former, and probably some complaints about "too many files
> > open" in the latter.  If there are, you need to either increase the
> > number of file descriptors in the Linux kernel, or decrease the max
> > number of open sockets in the Mongrel(s), with the -n option.  I don't
> > know if it solves the "RAM footprint growing to 150 Mb" problem...
> > I will know it first thing tomorrow morning :)
>
> No.  That is probably happening because of the file descriptor limit
> in Ruby.  Your Mongrel has accepted as many connections as Ruby can
> handle; it is out of descriptors.

What file descriptor limit are you referring to?  A typical Linux
default ulimit on file descriptors is 1024, which should be more than
enough for the test Ken is performing.  (A quick way to check the
limit your Mongrel process actually gets is sketched at the end of
this message.)

Also, I would recommend doing a test where you separate Mongrel from
Rails.  Use a simple Mongrel handler like the one found here:

http://mongrel.rubyforge.org/rdoc/index.html

require 'mongrel'

class SimpleHandler < Mongrel::HttpHandler
  def process(request, response)
    response.start(200) do |head, out|
      head["Content-Type"] = "text/plain"
      out.write("hello!\n")
    end
  end
end

h = Mongrel::HttpServer.new("0.0.0.0", "3000")
h.register("/test", SimpleHandler.new)
h.register("/files", Mongrel::DirHandler.new("."))
h.run.join

This should help narrow down the problem area.  If Mongrel itself is
to blame, you should still see lots-o-memory growth; otherwise, it is
the interface with Rails that is causing the problem.

Jim Powers
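
As a rough check on the descriptor limit discussed above, the short
script below prints the per-process open-file limit as the Ruby
process itself sees it.  This is only a sketch: it assumes a Ruby
build that exposes Process.getrlimit (most Linux builds do) and falls
back to asking the shell otherwise.  Run it as the same user that
starts Mongrel.

# limit_check.rb -- print the open file descriptor limit for this process
if Process.respond_to?(:getrlimit)
  soft, hard = Process.getrlimit(Process::RLIMIT_NOFILE)
  puts "open file descriptors: soft limit=#{soft}, hard limit=#{hard}"
else
  # Fall back to the shell's view of the limit.
  puts "ulimit -n reports: #{`ulimit -n`.strip}"
end

If the soft limit really is the bottleneck, it can usually be raised
with ulimit -n (up to the hard limit) in the shell that starts
Mongrel, or the load can be reduced on the Mongrel side with its -n
option as suggested above.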
