[jira] [Updated] (SPARK-26324) Spark submit does not work with Mesos over SSL [Missing docs]

2018-12-11 Thread Jorge Machado (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26324?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jorge Machado updated SPARK-26324:
--
Summary: Spark submit does not work with Mesos over SSL [Missing docs]
(was: Spark submit does not work with Mesos over SSL)

> Spark submit does not work with Mesos over SSL [Missing docs]
> --
>
> Key: SPARK-26324
> URL: https://issues.apache.org/jira/browse/SPARK-26324
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Submit
>Affects Versions: 2.4.0
>Reporter: Jorge Machado
>Priority: Major
>

[jira] [Updated] (SPARK-26324) Spark submit does not work with Mesos over SSL [Missing docs]

2018-12-11 Thread Jorge Machado (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26324?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jorge Machado updated SPARK-26324:
--
Description: 
Hi guys,

I was trying to run the examples on a Mesos cluster that uses HTTPS. First I tried the REST endpoint:
{code:java}
./spark-submit  --class org.apache.spark.examples.SparkPi --master 
mesos://:5050 --conf spark.master.rest.enabled=true 
--deploy-mode cluster --supervise --executor-memory 10G --total-executor-cores 
100 ../examples/jars/spark-examples_2.11-2.4.0.jar 1000
{code}
The error I get on the host where I started spark-submit is:
{code:java}
2018-12-10 15:08:39 WARN NativeCodeLoader:62 - Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
2018-12-10 15:08:39 INFO RestSubmissionClient:54 - Submitting a request to 
launch an application in mesos://:5050.
2018-12-10 15:08:39 WARN RestSubmissionClient:66 - Unable to connect to server 
mesos://:5050.
Exception in thread "main" 
org.apache.spark.deploy.rest.SubmitRestConnectionException: Unable to connect 
to server
at 
org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:104)
at 
org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:86)
at 
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at 
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at 
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at 
org.apache.spark.deploy.rest.RestSubmissionClient.createSubmission(RestSubmissionClient.scala:86)
at 
org.apache.spark.deploy.rest.RestSubmissionClientApp.run(RestSubmissionClient.scala:443)
at 
org.apache.spark.deploy.rest.RestSubmissionClientApp.start(RestSubmissionClient.scala:455)
at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.deploy.rest.SubmitRestConnectionException: Unable 
to connect to server
at 
org.apache.spark.deploy.rest.RestSubmissionClient.readResponse(RestSubmissionClient.scala:281)
at 
org.apache.spark.deploy.rest.RestSubmissionClient.org$apache$spark$deploy$rest$RestSubmissionClient$$postJson(RestSubmissionClient.scala:225)
at 
org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:90)
... 15 more
Caused by: java.net.SocketException: Connection reset
{code}
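As a quick cross-check outside Spark (illustrative only; host elided as in the logs above, and assuming the master's /state endpoint is reachable), the same reset shows up with a plain-HTTP request against the TLS port:
{code:java}
# Plain HTTP against the TLS-only master port reproduces the connection reset:
curl -v http://<mesos-master>:5050/state
# The same request over HTTPS goes through, given a trusted CA certificate:
curl -v --cacert /path/to/ca.pem https://<mesos-master>:5050/state
{code}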
I'm pretty sure this is because of the hardcoded http:// here:
{code:java}
// RestSubmissionClient.scala
/** Return the base URL for communicating with the server, including the protocol version. */
private def getBaseUrl(master: String): String = {
  var masterUrl = master
  supportedMasterPrefixes.foreach { prefix =>
    if (master.startsWith(prefix)) {
      masterUrl = master.stripPrefix(prefix)
    }
  }
  masterUrl = masterUrl.stripSuffix("/")
  s"http://$masterUrl/$PROTOCOL_VERSION/submissions"  // <--- hardcoded http
}
{code}
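For illustration, here is a minimal sketch of how the scheme could be made selectable instead of hardcoded. The _useSsl_ parameter is hypothetical, not an existing Spark config; a real patch would have to wire it to whatever SSL settings the submission client actually gets:
{code:java}
// Sketch only: `useSsl` is an illustrative parameter, not an existing Spark option.
private def getBaseUrl(master: String, useSsl: Boolean = false): String = {
  var masterUrl = master
  supportedMasterPrefixes.foreach { prefix =>
    if (master.startsWith(prefix)) {
      masterUrl = master.stripPrefix(prefix)
    }
  }
  masterUrl = masterUrl.stripSuffix("/")
  // Same logic as above, but the scheme follows the flag instead of being fixed.
  val scheme = if (useSsl) "https" else "http"
  s"$scheme://$masterUrl/$PROTOCOL_VERSION/submissions"
}
{code}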
Then I tried without _--deploy-mode cluster_ and got:
{code:java}
./spark-submit  --class org.apache.spark.examples.SparkPi --master 
mesos://:5050  --supervise --executor-memory 10G 
--total-executor-cores 100 ../examples/jars/spark-examples_2.11-2.4.0.jar 1000
{code}
On the Spark console I get:
{code:java}
2018-12-10 15:01:05 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at 
http://_host:4040
2018-12-10 15:01:05 INFO SparkContext:54 - Added JAR 
file:/home//spark-2.4.0-bin-hadoop2.7/bin/../examples/jars/spark-examples_2.11-2.4.0.jar
 at spark://_host:35719/jars/spark-examples_2.11-2.4.0.jar with timestamp 
1544450465799
I1210 15:01:05.963078 37943 sched.cpp:232] Version: 1.3.2
I1210 15:01:05.966814 37911 sched.cpp:336] New master detected at 
master@53.54.195.251:5050
I1210 15:01:05.967010 37911 sched.cpp:352] No credentials provided. Attempting 
to register without authentication
E1210 15:01:05.967347 37942 process.cpp:2455] Failed to shutdown socket with fd 
307, address 53.54.195.251:45206: Transport endpoint is not connected
E1210 15:01:05.968212 37942 process.cpp:2369] Failed to shutdown socket with fd 
307, address 53.54.195.251:45212: Transport endpoint is not connected
E1210 15:01:05.969405 37942 process.cpp:2455] Failed to shutdown socket with fd 
307, address 53.54.195.251:45222: Transport endpoint is not connected