Hi Jun,

You can still set the authentication variables through `spark-env.sh` by
exporting SPARK_MASTER_OPTS, SPARK_WORKER_OPTS, SPARK_HISTORY_OPTS, etc. to
include "-Dspark.auth.{...}". There is an open pull request that allows
these processes to also read from spark-defaults.conf, but it has not been
merged into master yet.
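For example, a minimal sketch (assuming the standard `spark.authenticate` /
`spark.authenticate.secret` properties; the secret value below is a
placeholder, not a real one):

```shell
# conf/spark-env.sh on each standalone daemon (master, workers, history server).
# The same shared secret must be configured everywhere; "mysecret" is a placeholder.
SPARK_AUTH_OPTS="-Dspark.authenticate=true -Dspark.authenticate.secret=mysecret"

# Append rather than overwrite, in case other options are already set.
export SPARK_MASTER_OPTS="$SPARK_MASTER_OPTS $SPARK_AUTH_OPTS"
export SPARK_WORKER_OPTS="$SPARK_WORKER_OPTS $SPARK_AUTH_OPTS"
export SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS $SPARK_AUTH_OPTS"
```

Applications submitted to the cluster then need the same two properties set as
well, e.g. in their spark-defaults.conf or via `--conf` on spark-submit.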

Andrew

2014-09-15 6:44 GMT-07:00 Tom Graves <tgraves...@yahoo.com.invalid>:

> Spark authentication does work in standalone mode (at least it did; I
> haven't tested it in a while). The same shared secret has to be set on all
> the daemons (master and workers) and then also in the configs of any
> applications submitted.  Since everyone shares the same secret, it's by no
> means ideal or a strong authentication.
>
> Tom
>
>
> On Thursday, September 11, 2014 4:17 AM, Jun Feng Liu <liuj...@cn.ibm.com>
> wrote:
>
>
>
> Hi, there
>
> I am trying to enable authentication
> for Spark in standalone mode. It seems like only SparkSubmit loads the
> properties
> from spark-defaults.conf.  org.apache.spark.deploy.master.Master does
> not really load the default settings from spark-defaults.conf.
>
> Does this mean that Spark authentication
> only works in a mode like YARN? Or did I miss something about standalone mode?
>
> Best Regards
>
> Jun Feng Liu
> IBM China Systems & Technology Laboratory in Beijing
>
> ________________________________
>
>   Phone: 86-10-82452683
> E-mail: liuj...@cn.ibm.com
>
> BLD 28,ZGC Software Park
> No.8 Rd.Dong Bei Wang West, Dist.Haidian Beijing 100193
> China
>
