[ 
https://issues.apache.org/jira/browse/SPARK-21642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aki updated SPARK-21642:
------------------------
    Description: 
In the current implementation, the IP address of the driver host is set as 
DRIVER_HOST_ADDRESS [1]. This becomes a problem when SSL is enabled via the 
"spark.ssl.enabled", "spark.ssl.trustStore", and "spark.ssl.keyStore" 
properties: the Spark web UI is then launched with SSL enabled, and the HTTPS 
server uses the custom SSL certificate specified by those properties.
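For context, enabling SSL for the web UI uses configuration along these lines (the keystore/truststore paths and password placeholders below are illustrative, not taken from this report):

```
spark.ssl.enabled              true
spark.ssl.keyStore             /path/to/keystore.jks
spark.ssl.keyStorePassword     <keystore-password>
spark.ssl.trustStore           /path/to/truststore.jks
spark.ssl.trustStorePassword   <truststore-password>
```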

In this case, a client accessing the Spark web UI gets a 
javax.net.ssl.SSLPeerUnverifiedException, because the client fails to verify 
the SSL certificate (the Common Name of the certificate does not match 
DRIVER_HOST_ADDRESS).
To avoid this exception, we should use the FQDN of the driver host for 
DRIVER_HOST_ADDRESS.
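A minimal sketch of the proposed change, as plain JDK code (the class and method names below are hypothetical, not the actual Spark helpers referenced in [1]): resolve the driver host's FQDN via a reverse DNS lookup instead of using its textual IP address.

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class FqdnSketch {
    // Resolve the fully qualified domain name of the local host.
    // getCanonicalHostName performs a reverse DNS lookup and falls back to
    // the textual IP address when no PTR record exists, so callers always
    // get a non-empty host string.
    static String localFqdn() throws UnknownHostException {
        return InetAddress.getLocalHost().getCanonicalHostName();
    }

    public static void main(String[] args) throws UnknownHostException {
        System.out.println(localFqdn());
    }
}
```

With a name resolved this way as DRIVER_HOST_ADDRESS, the hostname the client connects to can match the CN (or a SAN entry) of a certificate issued for the host, rather than a bare IP the certificate never mentions. Note the fallback: on hosts without reverse DNS this still yields the IP address, so the behavior would be unchanged there.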

[1]  
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/config/package.scala#L222
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L942

The error message a client gets when accessing the Spark web UI:
javax.net.ssl.SSLPeerUnverifiedException: Certificate for <10.102.138.239> 
doesn't match any of the subject alternative names: []


{code}
$ spark-submit /path/to/jar
..
17/08/04 14:48:07 INFO Utils: Successfully started service 'SparkUI' on port 
4040.
17/08/04 14:48:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://10.43.3.8:4040

$ curl -I http://10.43.3.8:4040
HTTP/1.1 302 Found
Date: Fri, 04 Aug 2017 14:48:20 GMT
Location: https://10.43.3.8:4440/
Content-Length: 0
Server: Jetty(9.2.z-SNAPSHOT)

$ curl -v https://10.43.3.8:4440
* Rebuilt URL to: https://10.43.3.8:4440/
*   Trying 10.43.3.8...
* TCP_NODELAY set
* Connected to 10.43.3.8 (10.43.3.8) port 4440 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
*   CAfile: /etc/pki/tls/certs/ca-bundle.crt
  CApath: none
* Server certificate:
*       subject: CN=*.example.com,OU=MyDept,O=MyOrg,L=Area,C=US
*       start date: Jun 12 00:05:02 2017 GMT
*       expire date: Jun 12 00:05:02 2018 GMT
*       common name: *.example.com
*       issuer: CN=*.example.com,OU=MyDept,O=MyOrg,L=Area,C=US
* NSS error -8172 (SEC_ERROR_UNTRUSTED_ISSUER)
* Peer's certificate issuer has been marked as not trusted by the user.
* Curl_http_done: called premature == 1
* Closing connection 0
curl: (60) Peer's certificate issuer has been marked as not trusted by the user.
More details here: https://curl.haxx.se/docs/sslcerts.html
{code}



> Use FQDN for DRIVER_HOST_ADDRESS instead of ip address
> ------------------------------------------------------
>
>                 Key: SPARK-21642
>                 URL: https://issues.apache.org/jira/browse/SPARK-21642
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0, 2.2.0
>            Reporter: Aki
>



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
