[grpc-io] Re: callback API and details on threading model

2023-03-29 Thread 'AJ Heller' via grpc.io
Sometime this year, gRPC C++ will switch to having, by default, a single 
auto-scaling thread pool per process, and all of gRPC's threaded activities 
will use it. Applications will still have some control over this, though: 
they will be able to provide custom EventEngine instances per channel or per 
server. 
See 
https://github.com/grpc/grpc/blob/ec1d75bb0a24a626e669696bb48490e7ac40cc69/include/grpc/event_engine/event_engine.h
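
To make the threading implications concrete: with the C++ callback API, gRPC 
invokes your service's reactor callbacks on threads it owns (which, after the 
change above, will come from the shared EventEngine pool), so handlers should 
stay short and non-blocking. Here is a minimal sketch of a callback-API unary 
service; the Greeter names come from the standard hello-world example proto 
and the address/port are placeholders, used purely as an illustration:

    #include <memory>

    #include <grpcpp/grpcpp.h>

    #include "helloworld.grpc.pb.h"  // generated from the hello-world example proto

    // Callback-API service: override methods of the generated CallbackService.
    class GreeterService final : public helloworld::Greeter::CallbackService {
     public:
      grpc::ServerUnaryReactor* SayHello(grpc::CallbackServerContext* ctx,
                                         const helloworld::HelloRequest* request,
                                         helloworld::HelloReply* reply) override {
        // This handler and the reactor callbacks run on gRPC-owned threads
        // (the shared pool described above); keep the work short and non-blocking.
        reply->set_message("Hello " + request->name());
        grpc::ServerUnaryReactor* reactor = ctx->DefaultReactor();
        reactor->Finish(grpc::Status::OK);
        return reactor;
      }
    };

    int main() {
      GreeterService service;
      grpc::ServerBuilder builder;
      builder.AddListeningPort("0.0.0.0:50051", grpc::InsecureServerCredentials());
      builder.RegisterService(&service);
      std::unique_ptr<grpc::Server> server = builder.BuildAndStart();
      server->Wait();
      return 0;
    }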

The answer to "how many RPCs can I operate" will depend heavily on how 
much work your services are doing, and perhaps on how quickly the pool needs 
to scale.
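
The same applies on the client side: with a single process-wide pool, every 
channel and stub in the application shares the same threads, and the 
completion callback of an async() call runs on one of them. A hedged sketch, 
again reusing the hello-world Greeter names and a made-up address, showing 
one way to hand the result back to an application thread rather than doing 
heavy work inside the callback:

    #include <future>
    #include <iostream>
    #include <memory>
    #include <utility>

    #include <grpcpp/grpcpp.h>

    #include "helloworld.grpc.pb.h"  // same illustrative hello-world proto as above

    int main() {
      // One channel here; in a larger application every channel and server in
      // the process is expected to share the same gRPC thread pool.
      auto channel = grpc::CreateChannel("localhost:50051",
                                         grpc::InsecureChannelCredentials());
      auto stub = helloworld::Greeter::NewStub(channel);

      grpc::ClientContext context;
      helloworld::HelloRequest request;
      request.set_name("world");
      helloworld::HelloReply reply;

      // The completion callback below is invoked on a gRPC-owned thread; hand
      // the result off (here via a promise) instead of doing heavy work in it.
      std::promise<grpc::Status> done;
      stub->async()->SayHello(&context, &request, &reply,
                              [&done](grpc::Status status) {
                                done.set_value(std::move(status));
                              });

      grpc::Status status = done.get_future().get();  // application thread waits
      if (status.ok()) {
        std::cout << "Reply: " << reply.message() << std::endl;
      }
      return 0;
    }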

Best,
-aj
On Thursday, March 23, 2023 at 5:12:49 AM UTC-7 Timo wrote:

> Hey Zach, I thought I had answered this, but it seems I missed it. The 
> question is about C++.
>
> Naman Shah wrote on Monday, September 26, 2022 at 05:31:14 UTC+2:
>
>> Hey Zach, I have the same question about the C++ implementation. 
>>
>> On Thursday, September 22, 2022 at 4:54:19 AM UTC+8 Zach Reyes wrote:
>>
>>> Which language of gRPC is this about? That will allow me to route it to 
>>> the correct person to answer.
>>>
>>> On Sunday, September 18, 2022 at 9:29:16 AM UTC-4 Timo wrote:
>>>
 I have researched this topic but have not yet found detailed information 
 in the documentation.
 How exactly does the threading model of the new callback API work?

 When using the synchronous API, I believe the threading model is this:
 - gRPC owns the threads, and their number can be limited
 - Several RPCs can operate on one thread, but there is a limit
 - When too many RPCs are open, the client receives a "resource 
 exhausted" error
 - An application with multiple clients needs at least one thread per 
 open RPC.

 For the callback (not the asynchronous) API, my understanding is:
 - gRPC owns the threads and spawns new ones if needed
 - Multiple RPCs can be handled on one thread without blocking
 For the server, I wonder how this scales with many RPCs being open (I 
 don't have a specific number in mind). Assuming all 16 threads have been 
 spawned, how many RPCs can I operate?
 Now assume I have an application with multiple clients, each connecting 
 to a different server.
 Would all the clients be able to share the same thread pool, or would 
 (in the worst case) each client spawn 16 threads?

 Especially when designing microservices, where each service offers a 
 server but can also be a client of another service, it may be important 
 not to scale the number of threads too much.

 Thanks

>>>
