spark.executor.instances is the parameter that you are looking for. Read
more here: http://spark.apache.org/docs/latest/running-on-yarn.html
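For example, here is a minimal sketch of a submission that requests one
executor per machine (the application file name and the core/memory
numbers are placeholders; size them to your EMR instance types):

  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --conf spark.executor.instances=3 \
    --conf spark.executor.cores=4 \
    --conf spark.executor.memory=4g \
    your_app.py

The --num-executors flag is shorthand for the same property. One caveat:
spark.executor.instances controls static allocation. If
spark.dynamicAllocation.enabled is true (which some EMR releases may set
by default), it only determines the initial number of executors, and YARN
scales the count up or down from there.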
On Sun, Jun 19, 2016 at 2:17 AM, Natu Lauchande wrote:
Hi,
I am running some Spark loads. I notice that it only uses one of the
machines (instead of the 3 available) in the cluster.
Is there any parameter that can be set to force it to use the whole cluster?
I am using AWS EMR with YARN.
Thanks,
Natu