[ https://issues.apache.org/jira/browse/SPARK-22778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16289818#comment-16289818 ]

Anirudh Ramanathan edited comment on SPARK-22778 at 12/13/17 8:23 PM:
----------------------------------------------------------------------

I submitted it to spark-submit as `k8s://https://xx.yy.zz.ww`. However, it 
appears the client-side validation of that URL has changed: it strips the 
`k8s` scheme and then re-adds it in the malformed format shown above. That 
change may be at fault here.

Here's my full spark-submit command:

bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://xx.yy.zz.ww \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=foxish/spark-driver:spark-k8s-master-13dec-11-56 \
  --conf spark.kubernetes.executor.docker.image=foxish/spark-executor:spark-k8s-master-13dec-11-56 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0-SNAPSHOT.jar
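To illustrate the suspected failure mode: if the client-side validation splits off the `k8s://` prefix and then rebuilds the master string with a bare `k8s:` scheme, the result is exactly the unparseable URL in the stack trace. This is only a rough sketch of that hypothesis; the function names below are illustrative, not Spark's actual code (which lives in the Scala submission client and SparkContext).

```python
# Hypothetical sketch of the suspected client-side master-URL handling.
# strip_k8s_prefix / rebuild_master are illustrative names, not Spark APIs.

def strip_k8s_prefix(master: str) -> str:
    # Validation removes the "k8s://" prefix to inspect the API server URL.
    return master[len("k8s://"):]

def rebuild_master(api_server: str) -> str:
    # Re-adding the scheme with a single colon (no "//") yields the
    # malformed URL seen in the SparkException below.
    return "k8s:" + api_server

submitted = "k8s://https://xx.yy.zz.ww"
rebuilt = rebuild_master(strip_k8s_prefix(submitted))
print(rebuilt)  # k8s:https://xx.yy.zz.ww -- not a valid master URL
```

If this is what is happening, the fix would be to preserve the full `k8s://` scheme (or re-add it with the `//`) when the validated URL is handed back to SparkContext.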



> Kubernetes scheduler at master failing to run applications successfully
> -----------------------------------------------------------------------
>
>                 Key: SPARK-22778
>                 URL: https://issues.apache.org/jira/browse/SPARK-22778
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.3.0
>            Reporter: Anirudh Ramanathan
>
> Building images based on master and deploying SparkPi results in the 
> following error.
> 2017-12-13 19:57:19 INFO  SparkContext:54 - Successfully stopped SparkContext
> Exception in thread "main" org.apache.spark.SparkException: Could not parse 
> Master URL: 'k8s:https://xx.yy.zz.ww'
>       at 
> org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2741)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:496)
>       at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2490)
>       at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:927)
>       at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:918)
>       at scala.Option.getOrElse(Option.scala:121)
>       at 
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:918)
>       at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
>       at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
> 2017-12-13 19:57:19 INFO  ShutdownHookManager:54 - Shutdown hook called
> 2017-12-13 19:57:19 INFO  ShutdownHookManager:54 - Deleting directory 
> /tmp/spark-b47515c2-6750-4a37-aa68-6ee12da5d2bd
> This is likely an artifact of recent changes in master, or of our 
> submission code currently under review; we haven't seen it on our fork. 
> Once integration tests are ported to run against upstream/master, we 
> should catch such issues earlier. 



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
