Hi,

I am using Scala to write a socket program that accepts multiple requests at the same time and then calls a function that uses Spark to process each one. I have a multi-threaded server that handles the concurrent requests and passes each one to Spark, but there is a bottleneck: Spark does not launch a separate sub-task for each new request.
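To make it concrete, here is a minimal, simplified sketch of the pattern I am using (the port, thread-pool size, and the body of handleRequest are placeholders, not my real code): every accepted connection is handed to a thread pool, and each handler submits its work to one shared SparkContext.

import java.net.ServerSocket
import java.util.concurrent.Executors

import scala.io.Source

import org.apache.spark.{SparkConf, SparkContext}

object ConcurrentRequestServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("socket-request-server")
      .setMaster("local[*]") // local testing only; normally set by spark-submit

    // One SparkContext shared by every request thread.
    val sc = new SparkContext(conf)

    val pool   = Executors.newFixedThreadPool(8) // one worker per in-flight request
    val server = new ServerSocket(9999)          // placeholder port

    while (true) {
      val socket = server.accept() // blocks until the next client connects
      pool.submit(new Runnable {
        override def run(): Unit = {
          try {
            val request = Source.fromInputStream(socket.getInputStream).getLines().next()
            val result  = handleRequest(sc, request) // runs Spark work for this request
            socket.getOutputStream.write((result + "\n").getBytes("UTF-8"))
            socket.getOutputStream.flush()
          } finally {
            socket.close()
          }
        }
      })
    }
  }

  // Placeholder for the real processing: each call triggers its own Spark job
  // on the shared context.
  def handleRequest(sc: SparkContext, request: String): String = {
    val tokens = sc.parallelize(request.split("\\s+").toSeq).filter(_.nonEmpty).count()
    s"tokens=$tokens"
  }
}

In this sketch only handleRequest touches Spark; everything else is plain JDK socket handling.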
Is it even possible to do parallel processing using a single Spark job?

Best Regards,
--
Tarek Abouzeid
