On Wed, 17 Jan 2001, Sam Horrocks wrote:
If in both the MRU and LRU cases there were exactly 10 interpreters busy at
all times, then you're right, it wouldn't matter. But don't confuse
the issues - 10 concurrent requests do *not* necessarily require 10
concurrent interpreters. The MRU has an
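The point Sam is making can be seen in a toy simulation (this is not SpeedyCGI's actual code; the one-request-per-tick arrival model and the parameter names are assumptions for illustration). With load well below the pool size, MRU hand-out keeps reusing the same few interpreters, while LRU cycles through the whole pool:

```python
from collections import deque

def simulate(policy, n_interp=10, n_reqs=50, service=2):
    """Toy discrete-time simulation: one request arrives per tick and
    holds an interpreter for `service` ticks, so steady-state
    concurrency is `service`, well below n_interp."""
    free = deque(range(n_interp))   # all interpreters idle at start
    busy = []                       # (finish_time, interpreter_id)
    used = set()                    # distinct interpreters ever touched
    for t in range(n_reqs):
        # release interpreters whose requests have finished
        for finish, i in [b for b in busy if b[0] <= t]:
            busy.remove((finish, i))
            free.append(i)          # freed interpreters go on the right
        # MRU pops the most recently freed; LRU pops the least recently used
        i = free.pop() if policy == "MRU" else free.popleft()
        used.add(i)
        busy.append((t + service, i))
    return len(used)

print(simulate("MRU"), simulate("LRU"))   # prints: 2 10
```

Both policies serve the same requests with the same concurrency; the difference is only in how many distinct interpreters get touched (and so stay paged in and warm).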
This is planned for a future release of speedycgi, though there will
probably be an option to set a maximum number of bytes that can be
buffered before the frontend contacts a perl interpreter and starts
passing over the bytes.
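A hypothetical sketch of the buffering scheme described above (these names and the 64 KB cap are illustrative assumptions, not SpeedyCGI's API): the frontend accumulates request bytes itself, and only ties up a backend perl interpreter once the buffer cap is exceeded or the body is complete:

```python
MAX_BUFFER = 64 * 1024  # assumed configurable cap on buffered bytes

def handle_request(client_chunks, acquire_interpreter, max_buffer=MAX_BUFFER):
    """Buffer request bytes in the frontend; acquire a backend
    interpreter only when the cap is hit or the body is complete."""
    buf, buffered, backend = [], 0, None
    for chunk in client_chunks:
        if backend is None:
            if buffered + len(chunk) <= max_buffer:
                buf.append(chunk)            # cheap: no interpreter held yet
                buffered += len(chunk)
                continue
            backend = acquire_interpreter()  # cap hit: start streaming
            for b in buf:
                backend.send(b)
            buf = []
        backend.send(chunk)
    if backend is None:                      # whole body fit in the buffer
        backend = acquire_interpreter()
        for b in buf:
            backend.send(b)
    return backend
```

The win is that a slow client uploading a large POST consumes only frontend memory, not an interpreter slot, until the transfer is (nearly) done.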
Currently you can do this sort of acceleration with script output
"Jeremy Howard" [EMAIL PROTECTED] wrote:
A backend server can realistically handle multiple frontend requests, since
the frontend server must stick around until the data has been delivered
to the client (at least that's my understanding of the lingering-close
issue that was recently discussed
Roger Espel Llima [EMAIL PROTECTED] writes:
"Jeremy Howard" [EMAIL PROTECTED] wrote:
I'm pretty sure I'm the person whose words you're quoting here,
not Jeremy's.
A backend server can realistically handle multiple frontend requests, since
the frontend server must stick around until the
Perrin Harkins wrote:
What I was saying is that it doesn't make sense for one to need fewer
interpreters than the other to handle the same concurrency. If you have
10 requests at the same time, you need 10 interpreters. There's no way
speedycgi can do it with fewer, unless it actually makes
On Thu, 21 Dec 2000, Sam Horrocks wrote:
Folks, your discussion is not short of wrong statements that can be easily
proved, but I don't find it useful.
I don't follow. Are you saying that my conclusions are wrong, but
you don't want to bother explaining why?
Would you agree with
Perrin Harkins wrote:
Keith Murphy pointed out that I was seeing the result of persistent HTTP
connections from my browser. Duh.
I must mention that, having seen your postings here over a long period,
anytime I can make you say "duh", my week is made. Maybe the whole
month.
That issue
"Jeremy Howard" [EMAIL PROTECTED] writes:
Perrin Harkins wrote:
What I was saying is that it doesn't make sense for one to need fewer
interpreters than the other to handle the same concurrency. If you have
10 requests at the same time, you need 10 interpreters. There's no way
At 10:17 PM 12/22/2000 -0500, Joe Schaefer wrote:
"Jeremy Howard" [EMAIL PROTECTED] writes:
[snipped]
I posted a patch to mod_proxy a few months ago that specifically
addresses this issue. It has a ProxyPostMax directive that changes
its behavior to a store-and-forward proxy for POST data (it
Joe Schaefer wrote:
"Jeremy Howard" [EMAIL PROTECTED] writes:
I don't know if Speedy fixes this, but one problem with mod_perl v1 is
that if, for instance, a large POST request is being uploaded, this
takes a whole perl interpreter while the transaction is occurring.
This is at least one
Folks, your discussion is not short of wrong statements that can be easily
proved, but I don't find it useful. Instead please read:
http://perl.apache.org/~dougm/modperl_2.0.html#new
To quote the most relevant part:
"With 2.0, mod_perl has much better control over which PerlInterpreters are
Hi Sam,
Processes 1, 2, 3 are running. 1 finishes and requests the mutex, then
2 finishes and requests the mutex, then 3 finishes and requests the mutex.
So when the next three requests come in, they are handled in the same order:
1, then 2, then 3 - this is FIFO or LRU. This is bad
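The ordering in the passage above can be illustrated directly (a toy demo, not mod_perl's accept-mutex code): handing the mutex out FIFO reproduces the LRU order described (1, then 2, then 3), whereas a LIFO stack would give the MRU order:

```python
from collections import deque

# Processes requested the mutex in this order after finishing.
finished_order = [1, 2, 3]

fifo = deque(finished_order)   # queue: oldest waiter served first
lifo = list(finished_order)    # stack: most recent waiter served first

fifo_served = [fifo.popleft() for _ in range(3)]  # FIFO/LRU: 1, 2, 3
lifo_served = [lifo.pop() for _ in range(3)]      # LIFO/MRU: 3, 2, 1

print("FIFO/LRU order:", fifo_served)
print("LIFO/MRU order:", lifo_served)
```

Under FIFO every process gets cycled through before any is reused, which is what keeps all of them resident; LIFO would let processes 1 and 2 stay idle (and eventually be reclaimed) whenever 3 alone can keep up.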
On Thu, 21 Dec 2000, Ken Williams wrote:
So in a sense, I think you're both correct. If "concurrency" means
the number of requests that can be handled at once, both systems are
necessarily (and trivially) equivalent. This isn't a very useful
measurement, though; a more useful one is how