In my use case:
- I have a queue (a proxy with a ROUTER socket in the front and a DEALER socket in the back).
- Each client has a DEALER socket that connects to the ROUTER in the proxy above. I'm not using REQ because I want the client to be able to send multiple jobs without waiting for a response.
- Each worker has a REP socket that connects to the DEALER in the proxy above.

This is working great. However, a design goal is that one client must not block other clients from sending requests and receiving responses, which is what is happening at the moment: if the first client sends a lot of requests BEFORE the second client starts sending its requests, then all of the first client's requests are served before the second client's.
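For reference, the setup described above can be sketched roughly as follows, assuming pyzmq; the endpoint names and the `run_demo` wrapper are illustrative, not part of my actual code. The DEALER client prepends an empty delimiter frame so the REP worker sees a well-formed envelope:

```python
import threading
import zmq


def run_demo():
    ctx = zmq.Context.instance()

    # The queue: ROUTER frontend for clients, DEALER backend for workers.
    frontend = ctx.socket(zmq.ROUTER)
    backend = ctx.socket(zmq.DEALER)
    frontend.bind("inproc://frontend")  # illustrative endpoint
    backend.bind("inproc://backend")    # illustrative endpoint
    threading.Thread(target=zmq.proxy, args=(frontend, backend),
                     daemon=True).start()

    # A worker: REP socket connected to the proxy's DEALER backend.
    def worker():
        sock = ctx.socket(zmq.REP)
        sock.connect("inproc://backend")
        while True:
            job = sock.recv()
            sock.send(b"done:" + job)

    threading.Thread(target=worker, daemon=True).start()

    # A client: DEALER socket, so it can fire off several jobs without
    # waiting for each reply (unlike REQ's strict send/recv lockstep).
    client = ctx.socket(zmq.DEALER)
    client.connect("inproc://frontend")
    for i in range(3):
        # Empty first frame mimics the envelope a REQ socket would add.
        client.send_multipart([b"", b"job%d" % i])

    # Replies come back as [empty delimiter, payload]; keep the payload.
    return [client.recv_multipart()[1] for _ in range(3)]
```

With a single worker the jobs are processed strictly in arrival order, which is exactly the head-of-line blocking problem described below.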
How can I improve this design so that if the 1st client sends 10000 requests and then a 2nd client comes along and sends 1 request, the 2nd client doesn't have to wait for the 10000 requests to finish? I guess one solution would be to lower the high-water mark so the 1st client would be blocked from sending more, but I feel that would hurt performance/scalability.

Thanks,
Nishant
_______________________________________________ zeromq-dev mailing list [email protected] http://lists.zeromq.org/mailman/listinfo/zeromq-dev
