Hi,

I think in local mode, N (in local[N]) is essentially the number of
available cores in ONE executor (worker), not N separate workers. You can
picture local[N] as one worker with N cores. I don't think you can set the
memory usage for each thread; in Spark, memory is shared within one
executor.
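
If it helps, here is a minimal sketch (Scala; the app name is just a
placeholder) of how this plays out in local mode:

    import org.apache.spark.{SparkConf, SparkContext}

    // "local[4]" = one executor (inside the driver JVM) running up to
    // 4 task threads; it is NOT 4 separate worker nodes.
    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("LocalProfiling") // hypothetical app name

    val sc = new SparkContext(conf)

    // All 4 threads share the same heap. In local mode the executor
    // lives in the driver JVM, so the heap must be sized before the JVM
    // starts, e.g.:
    //   spark-submit --driver-memory 4g --master local[4] ...
    // There is no per-thread memory setting.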

Thanks
Jerry


2015-03-30 4:21 GMT+08:00 FreePeter <wenlei....@gmail.com>:

> Hi,
>
> I am trying to use Spark for my own applications. I am currently
> profiling the performance in local mode, and I have a couple of
> questions:
>
> 1. When I set spark.master to local[N], it means Spark will use up to N
> worker *threads* on the single machine. Is this equivalent to saying
> there are N worker *nodes*, as described in
> http://spark.apache.org/docs/latest/cluster-overview.html?
> (So each worker node/thread is viewed separately and can have its own
> executor for each application.)
>
> 2. Is there any way to set the max memory used by each worker
> thread/node?
> I can only find a way to set the memory for each executor
> (spark.executor.memory).
>
> Thank you!