From: Yiannis Gkoufas [mailto:johngou...@gmail.com]
Sent: Friday, February 20, 2015 9:48 AM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: Setting the number of executors in standalone mode
Hi Mohammed,

Thanks a lot for the reply. So, from what I understand, I cannot control the number of executors per worker in standalone cluster mode. Is that correct?

BR,
Yiannis
On 20 February 2015 at 17:46, Mohammed Guller wrote:
SPARK_WORKER_MEMORY=8g

will allocate 8GB of memory to Spark on each worker node. It has nothing to do with the number of executors.
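For completeness: in the standalone mode of that era, an application got at most one executor per worker, so the usual workaround for multiple executors per machine was to launch several worker instances on each node via conf/spark-env.sh. A minimal sketch, with illustrative values:

```shell
# conf/spark-env.sh -- illustrative values, adjust to your machines
# Launch 2 worker processes on each node; since standalone mode runs at
# most one executor per worker per application, this yields up to 2
# executors per machine for a given application.
export SPARK_WORKER_INSTANCES=2
# Resources granted to EACH worker instance (not shared across instances)
export SPARK_WORKER_MEMORY=4g
export SPARK_WORKER_CORES=4
```

With multiple instances per node it also helps to cap an application's total cores (spark.cores.max) so a single application does not claim every core across all worker instances.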
Mohammed
From: Yiannis Gkoufas [mailto:johngou...@gmail.com]
Sent: Friday, February 20, 2015 4:55 AM
To: user@spark.apache.org
Subject: Setting the number of executors in standalone mode
Hi there,

I am trying to increase the number of executors per worker in standalone mode, but I have not managed to achieve that. I roughly followed the instructions in this thread:
http://stackoverflow.com/questions/26645293/spark-configuration-memory-instance-cores
and set:
spark.executor.memory 1g
S