From: Andrew Or
Date: 2014/09/17 02:06
To: Tom Graves
Cc: Jun Feng Liu/China/IBM@IBMCN, "dev@spark.apache.org"
Subject: Re: Spark authenticate enablement
Hi Jun,

You can still set the authentication variables through `spark-env.sh`, by exporting SPARK_MASTER_OPTS, SPARK_WORKER_OPTS, SPARK_HISTORY_OPTS, etc. to include "-Dspark.auth.{...}". There is an open pull request that allows these processes to also read from spark-defaults.conf, but this is not merged yet.
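As a concrete sketch of the approach described above, the exports in `conf/spark-env.sh` might look like this. The full property names assumed here are `spark.authenticate` and `spark.authenticate.secret`, and "mysecret" is a placeholder to replace with your own shared secret:

```shell
# conf/spark-env.sh (excerpt) -- set on every machine that runs a Spark daemon.
# "mysecret" is a placeholder; substitute your own shared secret.
SPARK_AUTH_ARGS="-Dspark.authenticate=true -Dspark.authenticate.secret=mysecret"

# Pass the same JVM options to the master, workers, and history server.
export SPARK_MASTER_OPTS="$SPARK_AUTH_ARGS"
export SPARK_WORKER_OPTS="$SPARK_AUTH_ARGS"
export SPARK_HISTORY_OPTS="$SPARK_AUTH_ARGS"
```

Because `spark-env.sh` is sourced by the daemon launch scripts, these options end up on each daemon's JVM command line.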
Spark authentication does work in standalone mode (at least it did; I haven't tested it in a while). The same shared secret has to be set on all the daemons (master and workers) and then also in the configs of any applications submitted. Since everyone shares the same secret, it's by no means ideal.
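On the application side, the matching secret would go into the submitting application's configuration, for example in `spark-defaults.conf`. A minimal sketch, assuming the property names `spark.authenticate` and `spark.authenticate.secret`; "mysecret" is a placeholder and must match the secret configured on the daemons:

```
# spark-defaults.conf on the submitting side -- must match the daemons' secret
spark.authenticate         true
spark.authenticate.secret  mysecret
```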
Hi Jun,
I believe that's correct that Spark authentication only works against YARN.
-Sandy
On Thu, Sep 11, 2014 at 2:14 AM, Jun Feng Liu wrote:

> Hi, there
>
> I am trying to enable the authentication on Spark in standalone mode.
> Seems like only SparkSubmit loads the properties from spark-defaults.conf;
> org.apache.spark.deploy.master.Master does not really load the default
> settings from spark-defaults.conf. Does it mean the Spark authentication
> does not work in standalone mode?