My experience is that increasing the number of threads past 4 does
absolutely nothing to help performance. Multiple threads will help with
the occasional long-running request, but if it's more than occasional
then the thread list will eventually clog up no matter how many threads
you have (and the
Have you tried looking at it with the deadlock debugger? That should
give you an idea of what it's actually running. (Also note whether the
deadlock debugger stays responsive even when Zope isn't; if it
doesn't, that's useful information too.)
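If the deadlock debugger itself is unavailable, a plain-Python way to get the same kind of information (a generic sketch, not the Zope product) is to dump every thread's current stack frame via `sys._current_frames`:

```python
import sys
import threading
import time
import traceback

def dump_thread_stacks():
    # sys._current_frames() maps thread id -> that thread's current
    # frame, even for threads that are blocked, which is exactly what
    # you want to see when Zope appears wedged.
    out = []
    for thread_id, frame in sys._current_frames().items():
        out.append("Thread %s:\n" % thread_id)
        out.extend(traceback.format_stack(frame))
    return "".join(out)

# Example: a worker blocked in a sleep shows up at its sleep() call.
worker = threading.Thread(target=time.sleep, args=(1.0,))
worker.start()
time.sleep(0.1)              # give the worker time to block
report = dump_thread_stacks()
worker.join()
```

The report lists one entry per live thread, so a clogged thread pool shows up as many threads stuck on the same line.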
Kevin
> This is relevant:
> http://docs.python.org/api/threads.html
>
> http://www.pyzine.com/Issue001/Section_Articles/article_ThreadingGlobalInterpreter.html
> But notice some C extensions do allow concurrency by releasing the
> GIL. Most I/O operations in the standard library do that. And probably th
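That point is easy to demonstrate: blocking calls such as `time.sleep` (and most socket and file I/O in the standard library) release the GIL, so I/O-bound threads genuinely overlap. A minimal sketch:

```python
import threading
import time

def blocking_io():
    # time.sleep releases the GIL while waiting, like most blocking
    # I/O calls, so the other threads keep running in the meantime.
    time.sleep(0.5)

start = time.time()
threads = [threading.Thread(target=blocking_io) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
# The four 0.5 s waits overlap: total wall time is roughly 0.5 s,
# not the 2.0 s that fully serialized threads would take.
```

This is why extra Zope threads help when requests spend their time waiting on the network or disk, but not when they are burning CPU in pure Python.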
What it sounds like is that Python is using a "sequential" threading
model. I don't know about Python in particular, but some languages
implement a sequential model internally so that multithreaded programs
will run correctly (if not efficiently) in the absence of whatever
thread libraries the lan
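On CPython that "sequential" behavior is the global interpreter lock: bytecode from two threads is interleaved, never run in parallel, so CPU-bound threads produce correct results but no speedup. A sketch (the timing is illustrative and not asserted, since it varies by machine):

```python
import threading
import time

def count_down(n, results, idx):
    # A pure-Python loop holds the GIL except at periodic switch
    # points, so two such threads end up sharing a single core.
    while n > 0:
        n -= 1
    results[idx] = "done"

N = 2_000_000
results = [None, None]

start = time.time()
t1 = threading.Thread(target=count_down, args=(N, results, 0))
t2 = threading.Thread(target=count_down, args=(N, results, 1))
t1.start()
t2.start()
t1.join()
t2.join()
threaded = time.time() - start
# On CPython the two threads take about as long as running the two
# loops back to back: correct results, no parallel speedup.
```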
> Four: the idea is to split up the processes so that their threads make best
> use of the available CPUs. Python's global interpreter lock makes using more
> than one CPU within a single process problematic.
In the multiprocess setup, is it better to run a single thread per
process or multiple threads per process?
I apologize if this has been asked ad nauseam. I've uncovered enough
information online to suggest that it has, but too little to get any
kind of definitive answer. My problem is that I am running a number of
Zope instances against a ZEO instance. I would like to store the
session data in the Z