Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-12 Thread Yan Fang
Thank you, Tathagata. That explains it. Fang, Yan yanfang...@gmail.com +1 (206) 849-4108 On Fri, Jul 11, 2014 at 7:21 PM, Tathagata Das wrote: > Task slot is equivalent to core number. So one core can only run one task > at a time. > > TD > > > On Fri, Jul 11, 2014 at 1:57 PM, Yan Fang wrote: >

Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-11 Thread Tathagata Das
Task slot is equivalent to core number. So one core can only run one task at a time. TD On Fri, Jul 11, 2014 at 1:57 PM, Yan Fang wrote: > Hi Tathagata, > > Thank you. Is task slot equivalent to the core number? Or actually one > core can run multiple tasks at the same time? > > Best, > > Fang
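
A rough sketch of the arithmetic implied here, with hypothetical numbers and assuming the default spark.task.cpus = 1 (one core per task):

    // Hypothetical cluster sizing, for illustration only.
    val numExecutors     = 3                      // e.g. --num-executors 3
    val coresPerExecutor = 4                      // e.g. --executor-cores 4
    val cpusPerTask      = 1                      // spark.task.cpus (default)

    val slotsPerExecutor = coresPerExecutor / cpusPerTask    // 4 task slots per executor
    val totalSlots       = slotsPerExecutor * numExecutors   // 12 tasks can run concurrently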

Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-11 Thread Yan Fang
Hi Tathagata, Thank you. Is a task slot equivalent to a core? Or can one core actually run multiple tasks at the same time? Best, Fang, Yan yanfang...@gmail.com +1 (206) 849-4108 On Fri, Jul 11, 2014 at 1:45 PM, Tathagata Das wrote: > The same executor can be used for both receiving a

Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-11 Thread Tathagata Das
The same executor can be used for both receiving and processing, irrespective of the deployment mode (YARN, Spark standalone, etc.). It boils down to the number of cores / task slots that the executor has. Each receiver is like a long-running task, so each of them occupies a slot. If there are free slots
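
As a minimal sketch of that point (illustrative code, not from the thread): a receiver holds one task slot for the lifetime of the application, so the context needs more cores than receivers or no slots are left for processing the batches.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // "local[2]" provides two task slots: one is held by the socket receiver,
    // the other processes the batches. "local[1]" would starve processing.
    val conf = new SparkConf().setAppName("ReceiverSlotSketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val lines = ssc.socketTextStream("localhost", 9999)  // one receiver = one occupied slot
    lines.count().print()                                 // batch tasks use the remaining slot

    ssc.start()
    ssc.awaitTermination()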

Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-11 Thread Yan Fang
Hi Praveen, Thank you for the answer. That's interesting, because if I only bring up one executor for the Spark Streaming application, it seems that only the receiver is working and no other tasks are happening, judging from the log and UI. Maybe it's just because the receiving task eats all the resources, not because

Re: How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-11 Thread Praveen Seluka
Here are my answers. But I am just getting started with Spark Streaming - so please correct me if I am wrong. 1) Yes 2) Receivers will run on executors. It's actually a job that's submitted where the # of tasks equals the # of receivers. An executor can actually run more than one task at the same time. Hence you
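
To make "# of tasks equals # of receivers" concrete, here is a hedged sketch (host names and ports are made up): each socketTextStream call creates its own receiver, i.e. its own long-running task, and the resulting streams can be unioned for downstream processing.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("MultiReceiverSketch")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Two receivers -> two long-running tasks; they may share an executor
    // if that executor has enough free cores (task slots).
    val streamA = ssc.socketTextStream("host-a", 9999)
    val streamB = ssc.socketTextStream("host-b", 9999)

    val merged = streamA.union(streamB)   // combine the two input streams
    merged.count().print()

    ssc.start()
    ssc.awaitTermination()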

How are the executors used in Spark Streaming in terms of receiver and driver program?

2014-07-10 Thread Yan Fang
Hi all, I am working to improve the parallelism of the Spark Streaming application. But I have trouble understanding how the executors are used and how the application is distributed. 1. In YARN, is one executor equal to one container? 2. I saw the statement that a streaming receiver runs on one wor
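
For question 1 (answered "Yes" by Praveen above): on YARN each executor runs in its own container, so the executor settings decide how many containers are requested and how many task slots each one offers. The sketch below uses placeholder sizes and an assumed app name, purely for illustration.

    import org.apache.spark.SparkConf

    // Placeholder sizes, for illustration only: request 4 executors
    // (4 YARN containers), each offering 2 cores, i.e. 2 task slots.
    val conf = new SparkConf()
      .setAppName("StreamingParallelismSketch")
      .set("spark.executor.instances", "4")   // same effect as --num-executors 4
      .set("spark.executor.cores", "2")       // same effect as --executor-cores 2
      .set("spark.executor.memory", "2g")     // same effect as --executor-memory 2g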