Hi,

In local mode, Spark runs in a single JVM that contains both the master (driver) and one executor with `k` threads:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94
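For example, a minimal sketch of starting a local-mode application (app name and thread count are illustrative; this uses the SparkConf/SparkContext API current as of Spark 1.x/2.x):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// "local[4]" runs everything in this one JVM: the driver acts as the
// master, and the single in-process executor gets 4 task threads.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("local-mode-test")
  // Since driver and executor share the JVM in local mode, heap size
  // is governed by the driver's memory setting.
  .set("spark.driver.memory", "2g")

val sc = new SparkContext(conf)

// Sanity check: this count is computed on the 4 local task threads.
println(sc.parallelize(1 to 100).count())

sc.stop()
```

No separate worker processes need to be started; memory is bounded by the driver JVM's heap, so size `spark.driver.memory` accordingly.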
// maropu

On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar <ashok34...@yahoo.com.invalid> wrote:
> Hi,
>
> I have been told Spark in local mode is simplest for testing. The Spark
> documentation covers little on local mode except the cores used in
> --master local[k].
>
> Where are the driver program, executor, and resources? Do I need to
> start worker threads, and how many apps can I run safely without
> exceeding the allocated memory?
>
> Thanking you

--
---
Takeshi Yamamuro