Re: Spark not using all the cluster instances in AWS EMR

2016-06-18 Thread Akhil Das
spark.executor.instances is the parameter that you are looking for. Read
more here http://spark.apache.org/docs/latest/running-on-yarn.html
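For reference, a minimal sketch of setting that property from application code (the app name and executor sizing below are illustrative values, not taken from this thread); the same thing can be done at submit time with --num-executors or --conf spark.executor.instances=N on spark-submit:

import org.apache.spark.{SparkConf, SparkContext}

object ExecutorConfigExample {
  def main(args: Array[String]): Unit = {
    // Ask YARN for one executor per worker node instead of relying on the
    // default (2 executors when spark.executor.instances is not set and
    // dynamic allocation is off). All values below are illustrative.
    val conf = new SparkConf()
      .setAppName("emr-load-example")          // hypothetical app name
      .set("spark.executor.instances", "3")    // e.g. one executor per EMR core node
      .set("spark.executor.cores", "2")        // assumed cores per executor
      .set("spark.executor.memory", "4g")      // assumed memory per executor

    val sc = new SparkContext(conf)
    // ... job logic here ...
    sc.stop()
  }
}

Equivalently: spark-submit --num-executors 3 --executor-cores 2 --executor-memory 4g ... Note that if dynamic allocation is enabled on the cluster, it manages the executor count and this setting is not the one in play.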

On Sun, Jun 19, 2016 at 2:17 AM, Natu Lauchande wrote:

> Hi,
>
> I am running some Spark loads. I notice that it only uses one of the
> machines (instead of the 3 available) in the cluster.
>
> Is there any parameter that can be set to force it to use the whole cluster?
>
> I am using AWS EMR with YARN.
>
>
> Thanks,
> Natu


-- 
Cheers!


Spark not using all the cluster instances in AWS EMR

2016-06-18 Thread Natu Lauchande
Hi,

I am running some Spark loads. I notice that it only uses one of the
machines (instead of the 3 available) in the cluster.

Is there any parameter that can be set to force it to use the whole cluster?

I am using AWS EMR with YARN.


Thanks,
Natu