[ https://issues.apache.org/jira/browse/SPARK-38862?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17526358#comment-17526358 ]

Jack commented on SPARK-38862:
------------------------------

While investigating an implementation for this, I found that the master 
implementation contains an explicit require ensuring that auth/encryption is 
not enabled while the REST submission server is running. I would like an 
option that allows me to run the REST submission server while the RPC 
channels use AES, provided I explicitly opt in and accept the implications of 
doing so. If this feature will not be considered, that option would at least 
allow me to sandbox the Spark REST server and provide my own authenticated 
and encrypted proxy on top.

For reference, the following exception is thrown due to the require in 
Master.scala:


22/04/20 22:07:53 ERROR SparkUncaughtExceptionHandler: Uncaught exception in 
thread Thread[main,5,main]
java.lang.IllegalArgumentException: requirement failed: The 
RestSubmissionServer does not support authentication via 
spark.authenticate.secret.  Either turn off the RestSubmissionServer with 
spark.master.rest.enabled=false, or do not use authentication.
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.deploy.master.Master.<init>(Master.scala:137)
        at 
org.apache.spark.deploy.master.Master$.startRpcEnvAndEndpoint(Master.scala:1229)
        at org.apache.spark.deploy.master.Master$.main(Master.scala:1211)
        at org.apache.spark.deploy.master.Master.main(Master.scala)
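For illustration, the guard behaves roughly like the following sketch (a hypothetical reconstruction based only on the exception message above; the object and parameter names are mine, not Spark's):

```scala
// Hypothetical sketch of the guard that Master.scala applies at startup.
// The message text is taken from the exception above; everything else
// (object name, parameters) is illustrative only.
object RestAuthGuard {
  def check(authEnabled: Boolean, restServerEnabled: Boolean): Unit = {
    // require throws IllegalArgumentException("requirement failed: ...")
    // when the condition is false, matching the stack trace above.
    require(!(authEnabled && restServerEnabled),
      "The RestSubmissionServer does not support authentication via " +
      "spark.authenticate.secret.  Either turn off the RestSubmissionServer " +
      "with spark.master.rest.enabled=false, or do not use authentication.")
  }
}
```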

> Basic Authentication or Token Based Authentication for The REST Submission 
> Server
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-38862
>                 URL: https://issues.apache.org/jira/browse/SPARK-38862
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core, Spark Submit
>    Affects Versions: 3.0.0, 3.0.1, 3.0.2, 3.0.3, 3.1.0, 3.1.1, 3.1.2, 3.2.0, 
> 3.2.1
>            Reporter: Jack
>            Priority: Major
>              Labels: authentication, rest, spark, spark-submit, submit
>
> [Spark documentation|https://spark.apache.org/docs/latest/security.html] 
> states that
> ??The REST Submission Server and the MesosClusterDispatcher do not support 
> authentication. You should ensure that all network access to the REST API & 
> MesosClusterDispatcher (port 6066 and 7077 respectively by default) are 
> restricted to hosts that are trusted to submit jobs.??
> Whilst it is true that we can use network policies to restrict access to our 
> exposed submission endpoint, it would be preferable to at least also allow 
> some primitive form of authentication at a global level, whether this is by 
> some token provided to the runtime environment or is a "system user" using 
> basic authentication of a username/password combination - I am not strictly 
> opinionated and I think either would suffice.
> I appreciate that one could implement a custom proxy to provide this 
> authentication check, but it seems like a common use case that others may 
> benefit from to be able to authenticate against the rest submission endpoint, 
> and by implementing this capability as an optionally configurable aspect of 
> Spark itself, we can utilise the existing server to provide this check.
> I would imagine that, whatever solution is agreed for a first phase, a custom 
> authenticator may be something we want a user to be able to provide, so that 
> if an admin needed a more advanced authentication check, such as RBAC, it 
> could be facilitated without the need for writing a complete custom proxy 
> layer; but I do feel there should be some basic built-in option available, eg. 
> RestSubmissionBasicAuthenticator.
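A hypothetical sketch of the basic-auth check such a RestSubmissionBasicAuthenticator might perform (the object and method names are illustrative and not part of Spark; it simply validates an HTTP "Authorization: Basic" header against a configured credential pair):

```scala
import java.util.Base64

// Illustrative only: a possible shape for the credential check behind a
// built-in basic authenticator on the REST submission server.
object BasicAuthCheck {
  // Returns true iff the header is "Basic <base64(user:password)>" for the
  // expected user/password combination.
  def authorized(header: Option[String], user: String, password: String): Boolean =
    header match {
      case Some(h) if h.startsWith("Basic ") =>
        val decoded =
          new String(Base64.getDecoder.decode(h.stripPrefix("Basic ")), "UTF-8")
        decoded == s"$user:$password"
      case _ => false  // missing or non-Basic header: reject
    }
}
```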



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
