[https://issues.apache.org/jira/browse/SPARK-25078?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17777377#comment-17777377]

Yaroslav commented on SPARK-25078:
----------------------------------

Hi, this issue is still reproducible. In SPARK-8129 the way the Worker sends the 
"spark.authenticate.secret" value to the Driver was changed from a Java option to 
an environment variable, for security reasons: other processes can freely view a 
process's Java options, while only the process owner can see its environment 
variables. So the sender should 
[add|https://github.com/apache/spark/blob/v3.5.0/core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala#L89-L92]
 the value to the environment, and the receiver should take it from there, not from 
the Spark config. A universal method, [getSecretKey 
|https://github.com/apache/spark/blob/v3.5.0/core/src/main/scala/org/apache/spark/SecurityManager.scala#L282-L307], 
was created which can read the value either from the config or from the 
environment. But for some reason initializeAuth() still 
[searches|https://github.com/apache/spark/blob/v3.5.0/core/src/main/scala/org/apache/spark/SecurityManager.scala#L337]
 for this key only in the Spark config, which fails and throws the error below. The 
following change would fix that, and I suppose the getSecretKey method was created 
exactly for this kind of use:

{code:java}
-        require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
+        require(getSecretKey() != null,
{code}
I guess it won't affect anything else: even if the key is in the config and not in 
the environment, getSecretKey will still look in the config and return the value, 
whereas searching only in the config does not cover all cases.
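For illustration, the env-first fallback behaviour described above can be sketched roughly as follows. This is a minimal standalone sketch, not the actual Spark source: the object, parameter names, and the use of plain Maps in place of SparkConf and sys.env are all hypothetical; only the config key and env variable names come from the linked code.

```scala
// Sketch of an env-first secret lookup in the spirit of
// SecurityManager.getSecretKey. Plain Maps stand in for SparkConf
// and sys.env so the example is self-contained.
object SecretLookup {
  val EnvVarName = "_SPARK_AUTH_SECRET"          // env var set by the Worker
  val ConfKey    = "spark.authenticate.secret"   // Spark config key

  // Prefer the environment, fall back to the config, null if neither is set.
  def getSecretKey(conf: Map[String, String],
                   env: Map[String, String]): String =
    env.get(EnvVarName)
      .orElse(conf.get(ConfKey))
      .orNull

  def main(args: Array[String]): Unit = {
    // Secret only in the environment: a config-only check like
    // sparkConf.contains(...) would fail here, but the env-first
    // lookup still finds it.
    assert(getSecretKey(Map.empty, Map(EnvVarName -> "s3cret")) == "s3cret")
    // Secret only in the config: still found.
    assert(getSecretKey(Map(ConfKey -> "cfg-secret"), Map.empty) == "cfg-secret")
    // Neither: null, so require(getSecretKey() != null, ...) would fail.
    assert(getSecretKey(Map.empty, Map.empty) == null)
  }
}
```

This is why checking getSecretKey() for null covers both the cluster-mode case (env variable) and the existing config-based case.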

So [~irashid], [~maropu], could you please review the status of this issue? It is 
marked as Resolved (Incomplete) while the error is still easily reproducible, and 
easily fixable as well.

Thanks!

> Standalone does not work with spark.authenticate.secret and 
> deploy-mode=cluster
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-25078
>                 URL: https://issues.apache.org/jira/browse/SPARK-25078
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.4.0
>            Reporter: Imran Rashid
>            Priority: Major
>              Labels: bulk-closed
>
> When running a spark standalone cluster with spark.authenticate.secret setup, 
> you cannot submit a program in cluster mode, even with the right secret.  The 
> driver fails with:
> {noformat}
> 18/08/09 08:17:21 INFO SecurityManager: SecurityManager: authentication 
> enabled; ui acls disabled; users  with view permissions: Set(systest); groups 
> with view permissions: Set(); users  with modify permissions: Set(systest); 
> groups with modify permissions: Set()
> 18/08/09 08:17:21 ERROR SparkContext: Error initializing SparkContext.
> java.lang.IllegalArgumentException: requirement failed: A secret key must be 
> specified via the spark.authenticate.secret config.
>         at scala.Predef$.require(Predef.scala:224)
>         at 
> org.apache.spark.SecurityManager.initializeAuth(SecurityManager.scala:361)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:238)
>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
>         at 
> org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
> ...
> {noformat}
> but it's actually doing the wrong check in 
> {{SecurityManager.initializeAuth()}}.  The secret is there, it's just in an 
> environment variable {{_SPARK_AUTH_SECRET}} (so it's not visible to another 
> process).
> *Workaround*: In your program, you can pass a dummy secret into your Spark 
> conf.  Its value doesn't matter at all; it will later be ignored, and when 
> establishing connections, the secret from the env variable will be used.  E.g.
> {noformat}
> val conf = new SparkConf()
> conf.setIfMissing("spark.authenticate.secret", "doesn't matter")
> val sc = new SparkContext(conf)
> {noformat}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
