Instead of pushing your requests to the socket, why don't you push them to
Kafka or any other message queue and use Spark Streaming to process them?
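
For example, a minimal sketch using Spark Streaming's Kafka direct stream
API could look like the following (the broker addresses, the "requests"
topic name, and handleRequest are placeholders for your own setup):

  import kafka.serializer.StringDecoder
  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.kafka.KafkaUtils
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object RequestStreamProcessor {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("RequestStreamProcessor")
      val ssc  = new StreamingContext(conf, Seconds(5))

      // Placeholder broker list and topic; each incoming request is one Kafka message.
      val kafkaParams = Map("metadata.broker.list" -> "broker1:9092,broker2:9092")
      val topics      = Set("requests")

      val requests = KafkaUtils
        .createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)
        .map { case (_, value) => value }

      // Spark processes each micro-batch in parallel across the topic's partitions,
      // so you no longer need your own multi-threaded dispatcher.
      requests.foreachRDD { rdd =>
        rdd.foreach(request => handleRequest(request))
      }

      ssc.start()
      ssc.awaitTermination()
    }

    // Stand-in for your existing per-request processing logic.
    def handleRequest(request: String): Unit = {
      println(s"Processing: $request")
    }
  }

Parallelism then comes from the number of Kafka partitions rather than from
threads on your side.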

Thanks
Best Regards

On Mon, Oct 5, 2015 at 6:46 PM, <tarek.abouzei...@yahoo.com.invalid> wrote:

> Hi,
> I am using Scala. I have a socket program that catches multiple requests at
> the same time and then calls a function which uses Spark to handle each one.
> I have a multi-threaded server to handle the multiple requests and pass each
> to Spark, but there is a bottleneck: Spark does not start a sub-task for the
> new request. Is it even possible to do parallel processing within a single
> Spark job?
> Best Regards,
>
> --
> Best Regards,
> Tarek Abouzeid
>
