Update - it seems 'spark-shell' does not support the yarn-cluster mode
(presumably because it is an interactive shell).

The only supported modes appear to be yarn-client and local.

Please let me know if my understanding is incorrect.
Thanks!
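For reference, a sketch of the distinction (an interactive shell has to keep its driver on the local machine, which is why cluster mode is rejected). Cluster mode is only reachable via spark-submit with a packaged application; the class and jar names below are placeholders, not real artifacts. Note also that since Spark 2.0 the 'yarn-client'/'yarn-cluster' master strings are deprecated in favor of '--master yarn' plus '--deploy-mode':

```shell
# Client mode: the driver (the REPL itself) runs locally; executors run on YARN.
./bin/spark-shell --master yarn --deploy-mode client \
    --driver-memory 512m --executor-memory 512m

# Cluster mode: the driver runs inside YARN, so it needs a packaged app
# submitted via spark-submit -- class and jar names here are hypothetical.
./bin/spark-submit --master yarn --deploy-mode cluster \
    --class com.example.MyApp \
    --driver-memory 512m --executor-memory 512m \
    my-app.jar
```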


On Sun, Aug 6, 2017 at 10:07 AM, karan alang <karan.al...@gmail.com> wrote:

> Hello all - I had a basic question about the modes in which spark-shell
> can be run.
>
> When I run the following command,
> does Spark run in local mode, i.e. outside of YARN and using the local
> cores? (since the '--master' option is missing)
>
> ./bin/spark-shell --driver-memory 512m --executor-memory 512m
>
> Similarly, when I run the following:
>
> 1) ./bin/spark-shell --master yarn-client --driver-memory 512m
> --executor-memory 512m
>
>    - Spark runs in client mode, with resources managed by YARN.
>
> 2) ./bin/spark-shell --master yarn-cluster --driver-memory 512m
> --executor-memory 512m
>
>     - Spark runs in cluster mode, with resources managed by YARN.
>
>
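One quick way to check which mode the shell actually picked up for the first command above: inside the REPL, `sc.master` reports the active master URL, and with no `--master` flag spark-shell falls back to local mode using all available cores.

```shell
# Launch without --master; the shell defaults to local mode.
./bin/spark-shell --driver-memory 512m --executor-memory 512m
# Then, inside the REPL:
#   scala> sc.master
#   res0: String = local[*]   <- local mode, using all cores on this machine
```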
