>>> The "--master" should override any other ways of setting the Spark
master.

Ah yes, actually you can set "spark.master" directly in your application
through SparkConf. Thanks Marcelo.
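
For anyone who hits this: a minimal sketch (the app name is illustrative)
of how a hardcoded setMaster() in application code silently overrides the
--master flag passed to spark-submit, plus the portable alternative:

    import org.apache.spark.{SparkConf, SparkContext}

    // setMaster() in code wins over "--master yarn" on the command line,
    // which would explain seeing spark.master = local[24] in the UI:
    // val conf = new SparkConf().setAppName("MyApp").setMaster("local[24]")

    // Portable version: omit setMaster() and let spark-submit (or
    // spark-defaults.conf) supply the master.
    val conf = new SparkConf().setAppName("MyApp")
    val sc = new SparkContext(conf)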


2014-08-19 14:47 GMT-07:00 Marcelo Vanzin <van...@cloudera.com>:

> On Tue, Aug 19, 2014 at 2:34 PM, Arun Ahuja <aahuj...@gmail.com> wrote:
> > /opt/cloudera/parcels/CDH/bin/spark-submit \
> >     --master yarn \
> >     --deploy-mode client \
>
> This should be enough.
>
> > But when I view the job's SparkUI page on port 4040, there is a single
> > executor (just the driver node) and I see the following in the
> > environment tab:
> >
> > spark.master - local[24]
>
> Hmmm. Are you sure the app itself is not overwriting "spark.master"
> before creating the SparkContext? That's the only explanation I can
> think of.
>
> > Also, when I run with yarn-cluster, how can I access the SparkUI page?
>
> You can click on the link in the RM application list. The address is
> also printed to the AM logs, which are also available through the RM
> web ui. Finally, the link is printed to the output of the launcher
> process (look for "appTrackingUrl").
>
>
> --
> Marcelo
>
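
For reference, one way to capture the tracking URL Marcelo mentions from
the launcher output, and to look it up again later through the standard
YARN CLI (the application id below is only an example):

    # spark-submit prints the YARN tracking URL while launching in
    # yarn-cluster mode; tee the output and grep for it
    spark-submit --master yarn --deploy-mode cluster ... 2>&1 | tee submit.log
    grep appTrackingUrl submit.log

    # or ask YARN directly once the application is running
    yarn application -list
    yarn application -status application_1408478000000_0001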
