Hi all,

Rob is right, the threading strategy depends on the actual server used
(Grizzly, Jetty, Simple or the internal NRE one). In general, there is a pool
of worker threads. This pool has a configurable size, so you can limit the
maximum number of concurrent threads. See this page for configuration
details:
http://www.restlet.org/documentation/1.1/connectors#jetty

For Jetty, here is the list of available parameters:
http://www.restlet.org/documentation/1.1/ext/com/noelios/restlet/ext/jetty/JettyServerHelper
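
To make this concrete, here is a rough sketch (just an illustration, not
taken from the docs) of how connector parameters can be passed through the
server's context in Restlet 1.1. The port and the minThreads/maxThreads
values are only examples; check the JettyServerHelper page above for the
exact parameter names and defaults supported by your connector:

import org.restlet.Component;
import org.restlet.Server;
import org.restlet.data.Protocol;

public class ThreadPoolConfig {

    public static void main(String[] args) throws Exception {
        Component component = new Component();

        // Add an HTTP server connector (port 8182 is just an example)
        Server server = component.getServers().add(Protocol.HTTP, 8182);

        // Pass the thread pool settings as connector parameters. These
        // names come from the JettyServerHelper documentation; other
        // connectors may expose different parameters.
        server.getContext().getParameters().add("minThreads", "5");
        server.getContext().getParameters().add("maxThreads", "50");

        // Attach your application and start the component as usual
        // component.getDefaultHost().attach(new MyApplication());
        component.start();
    }
}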

Now, be aware that web browsers generally serialize requests to a single
target server/domain, or at least limit the number of concurrent requests
they send.

If you really want to test the concurrent behavior of your Restlet
application, you should instead use a proper load tester. Here is a
benchmark that Thierry ran on the 1.0 connectors:
http://www.restlet.org/documentation/1.0/benchmark
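
If you just want a quick, rough sanity check before reaching for a real
load tester (JMeter, ApacheBench, etc.), something like the sketch below
fires a batch of truly concurrent requests from plain Java threads using the
Restlet Client class. The target URI is hypothetical, and a dedicated load
tester will of course give you far more meaningful numbers:

import org.restlet.Client;
import org.restlet.data.Protocol;
import org.restlet.data.Response;

public class ConcurrencyCheck {

    public static void main(String[] args) throws Exception {
        // Hypothetical target; point this at one of your own resources
        final String uri = "http://localhost:8182/myresource";

        final Client client = new Client(Protocol.HTTP);
        client.start();

        // Fire the requests from separate threads so the server really
        // sees them concurrently (unlike a browser would send them)
        Thread[] threads = new Thread[20];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(new Runnable() {
                public void run() {
                    Response response = client.get(uri);
                    System.out.println(response.getStatus());
                }
            });
            threads[i].start();
        }

        // Wait for all requests to complete before stopping the client
        for (Thread t : threads) {
            t.join();
        }
        client.stop();
    }
}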

Best regards,
Jerome  

> -----Message d'origine-----
> De : Rob Heittman [mailto:[EMAIL PROTECTED] 
> Envoyé : lundi 3 mars 2008 22:45
> À : discuss@restlet.tigris.org
> Objet : Re: Understanding Restlet's Threading Model?
> 
> 
> The mapping of incoming network connections to threads is 
> very dependent on the HTTP/HTTPS connector/server in use.  
> Restlet, as far as I know, does not do anything to attenuate 
> the native behavior of the server with regard to creating 
> threads for incoming network connections.  Which server 
> environment were you looking at when you tested?
> 
> 
> On Mon, Mar 3, 2008 at 2:43 PM, Aaron Crow <[EMAIL PROTECTED]> wrote:
> 
>       I'd like to understand the threading model used by my basic Restlet
>       app. I have a standalone app that uses Application and Component,
>       and attaches subclasses of Restlet to the router. I am using the
>       reference implementation provided by noelios. (Many, many thanks to
>       Jerome for all of this!)
