[ https://issues.apache.org/jira/browse/SPARK-26324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16723761#comment-16723761 ]

Jorge Machado commented on SPARK-26324:
---------------------------------------

[~hyukjin.kwon] I created a PR for these docs.

> Spark submit does not work with Mesos over SSL [Missing docs]
> -------------------------------------------------------------
>
>                 Key: SPARK-26324
>                 URL: https://issues.apache.org/jira/browse/SPARK-26324
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.4.0
>            Reporter: Jorge Machado
>            Priority: Major
>
> Hi guys,
> I was trying to run the examples on a Mesos cluster that uses HTTPS. I first
> tried with the REST endpoint:
> {code:java}
> ./spark-submit --class org.apache.spark.examples.SparkPi \
>   --master mesos://<mesos_master_with_https>:5050 \
>   --conf spark.master.rest.enabled=true \
>   --deploy-mode cluster --supervise \
>   --executor-memory 10G --total-executor-cores 100 \
>   ../examples/jars/spark-examples_2.11-2.4.0.jar 1000
> {code}
> The error I get on the host where I ran spark-submit is:
> {code:java}
> 2018-12-10 15:08:39 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2018-12-10 15:08:39 INFO RestSubmissionClient:54 - Submitting a request to launch an application in mesos://<mesos_master_with_https>:5050.
> 2018-12-10 15:08:39 WARN RestSubmissionClient:66 - Unable to connect to server mesos://<mesos_master_with_https>:5050.
> Exception in thread "main" org.apache.spark.deploy.rest.SubmitRestConnectionException: Unable to connect to server
>     at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:104)
>     at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:86)
>     at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>     at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>     at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>     at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>     at org.apache.spark.deploy.rest.RestSubmissionClient.createSubmission(RestSubmissionClient.scala:86)
>     at org.apache.spark.deploy.rest.RestSubmissionClientApp.run(RestSubmissionClient.scala:443)
>     at org.apache.spark.deploy.rest.RestSubmissionClientApp.start(RestSubmissionClient.scala:455)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: org.apache.spark.deploy.rest.SubmitRestConnectionException: Unable to connect to server
>     at org.apache.spark.deploy.rest.RestSubmissionClient.readResponse(RestSubmissionClient.scala:281)
>     at org.apache.spark.deploy.rest.RestSubmissionClient.org$apache$spark$deploy$rest$RestSubmissionClient$$postJson(RestSubmissionClient.scala:225)
>     at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:90)
>     ... 15 more
> Caused by: java.net.SocketException: Connection reset
> {code}
> I'm pretty sure this is because of the hardcoded http:// here:
> {code:java}
> // RestSubmissionClient.scala
> /** Return the base URL for communicating with the server, including the protocol version. */
> private def getBaseUrl(master: String): String = {
>   var masterUrl = master
>   supportedMasterPrefixes.foreach { prefix =>
>     if (master.startsWith(prefix)) {
>       masterUrl = master.stripPrefix(prefix)
>     }
>   }
>   masterUrl = masterUrl.stripSuffix("/")
>   s"http://$masterUrl/$PROTOCOL_VERSION/submissions" // <-- hardcoded http
> }
> {code}
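> A minimal sketch of one possible fix (not the actual Spark implementation: it reuses supportedMasterPrefixes and PROTOCOL_VERSION from RestSubmissionClient, and the useSsl flag is hypothetical, e.g. driven by a new conf) would pick the scheme instead of hardcoding it:
> {code:java}
> // Sketch only: useSsl is a hypothetical flag; supportedMasterPrefixes and
> // PROTOCOL_VERSION are the existing RestSubmissionClient members.
> private def getBaseUrl(master: String, useSsl: Boolean): String = {
>   val masterUrl = supportedMasterPrefixes
>     .find(master.startsWith)   // strip a known prefix such as "mesos://"
>     .map(master.stripPrefix)
>     .getOrElse(master)
>     .stripSuffix("/")
>   val scheme = if (useSsl) "https" else "http" // choose scheme instead of hardcoding
>   s"$scheme://$masterUrl/$PROTOCOL_VERSION/submissions"
> }
> {code}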
> Then I tried without _--deploy-mode cluster_ and got:
> {code:java}
> ./spark-submit --class org.apache.spark.examples.SparkPi \
>   --master mesos://<server_using_https>:5050 \
>   --supervise --executor-memory 10G --total-executor-cores 100 \
>   ../examples/jars/spark-examples_2.11-2.4.0.jar 1000
> {code}
> On the Spark console I get:
> {code:java}
> 2018-12-10 15:01:05 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://_host:4040
> 2018-12-10 15:01:05 INFO SparkContext:54 - Added JAR file:/home/<user>/spark-2.4.0-bin-hadoop2.7/bin/../examples/jars/spark-examples_2.11-2.4.0.jar at spark://_host:35719/jars/spark-examples_2.11-2.4.0.jar with timestamp 1544450465799
> I1210 15:01:05.963078 37943 sched.cpp:232] Version: 1.3.2
> I1210 15:01:05.966814 37911 sched.cpp:336] New master detected at master@53.54.195.251:5050
> I1210 15:01:05.967010 37911 sched.cpp:352] No credentials provided. Attempting to register without authentication
> E1210 15:01:05.967347 37942 process.cpp:2455] Failed to shutdown socket with fd 307, address 53.54.195.251:45206: Transport endpoint is not connected
> E1210 15:01:05.968212 37942 process.cpp:2369] Failed to shutdown socket with fd 307, address 53.54.195.251:45212: Transport endpoint is not connected
> E1210 15:01:05.969405 37942 process.cpp:2455] Failed to shutdown socket with fd 307, address 53.54.195.251:45222: Transport endpoint is not connected
> {code}
> On Mesos I get:  
> {code:java}
> E1210 15:01:06.665076  2633 process.cpp:956] Failed to accept socket: Failed accept: connection error: error:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request
> {code}
> That Mesos-side error means the client is sending plain HTTP to a socket that expects an SSL handshake. I could not find any documentation on how to connect the two. Do I need to set some ACLs or java_opts for SSL?
> OK, after setting these environment variables it worked:
> {code:java}
> # Enable SSL in libprocess (the Mesos transport the driver uses)
> LIBPROCESS_SSL_ENABLED=true
> # Skip peer certificate verification (testing only)
> LIBPROCESS_SSL_VERIFY_CERT=false
> # Key and certificate presented by the driver's libprocess endpoint
> LIBPROCESS_SSL_KEY_FILE=/home/machjor/server_2048.key
> LIBPROCESS_SSL_CERT_FILE=/home/machjor/server.crt
> {code}
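> One way to pass them (a sketch, assuming a POSIX shell; same client-mode submission and paths as above):
> {code:java}
> LIBPROCESS_SSL_ENABLED=true \
> LIBPROCESS_SSL_VERIFY_CERT=false \
> LIBPROCESS_SSL_KEY_FILE=/home/machjor/server_2048.key \
> LIBPROCESS_SSL_CERT_FILE=/home/machjor/server.crt \
> ./spark-submit --class org.apache.spark.examples.SparkPi \
>   --master mesos://<server_using_https>:5050 \
>   --supervise --executor-memory 10G --total-executor-cores 100 \
>   ../examples/jars/spark-examples_2.11-2.4.0.jar 1000
> {code}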
> *Should we update the Spark docs?*


