Hi, Tarek,

It is hard to answer your question without more detail. Are these requests
similar to each other? Could you cache results or intermediate results in
your application? Or is your throughput requirement very high? Have you
considered throttling the number of concurrent requests?

As Akhil said, Kafka might help in your case. Otherwise, you may need to
study the design, or even the source code, of Kafka and Spark Streaming.

Best wishes,

Xiao Li


2015-10-11 23:19 GMT-07:00 Akhil Das <ak...@sigmoidanalytics.com>:

> Instead of pushing your requests to the socket, why don't you push them to
> Kafka or any other message queue and use Spark Streaming to process them?
>
> Thanks
> Best Regards
>
> On Mon, Oct 5, 2015 at 6:46 PM, <tarek.abouzei...@yahoo.com.invalid>
> wrote:
>
>> Hi,
>> I am using Scala. I wrote a socket program to catch multiple requests at
>> the same time and then call a function which uses Spark to handle each
>> one. I have a multi-threaded server that accepts the requests and passes
>> each to Spark, but there is a bottleneck: Spark does not start a sub-task
>> for each new request. Is it even possible to do parallel processing
>> within a single Spark job?
>> Best Regards,
>>
>> -- Tarek Abouzeid
>>
>
>
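On the original question: concurrent work inside one Spark application is possible, because SparkContext is thread-safe and jobs submitted from separate threads run concurrently (setting `spark.scheduler.mode` to `FAIR` lets them share executors rather than queue FIFO). A minimal sketch, assuming each request arrives as a string from the existing socket server (the thread-pool size and the per-request computation are illustrative placeholders):

```scala
// Sketch: concurrent jobs within a single Spark application.
import java.util.concurrent.Executors
import org.apache.spark.{SparkConf, SparkContext}

object ParallelRequests {
  def main(args: Array[String]): Unit = {
    // FAIR scheduling lets concurrent jobs share executors instead of
    // running strictly one after another.
    val conf = new SparkConf()
      .setAppName("parallel-requests")
      .set("spark.scheduler.mode", "FAIR")
    val sc = new SparkContext(conf)

    val pool = Executors.newFixedThreadPool(8) // placeholder thread count

    // Each request gets its own thread; SparkContext is thread-safe, so
    // the resulting jobs execute concurrently in one application.
    def submit(request: String): Unit = pool.execute(new Runnable {
      def run(): Unit = {
        // Placeholder computation standing in for the real per-request logic.
        val result = sc.parallelize(request.split("\\s+")).map(_.length).sum()
        println(s"$request -> $result")
      }
    })
  }
}
```

The multi-threaded server would call `submit` for each accepted connection, so requests no longer serialize behind a single Spark action.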