Andreas Adamides created SPARK-27059:
----------------------------------------

             Summary: spark-submit on kubernetes cluster does not recognise k8s --master property
                 Key: SPARK-27059
                 URL: https://issues.apache.org/jira/browse/SPARK-27059
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 2.4.0, 2.3.3
            Reporter: Andreas Adamides


I have successfully installed a Kubernetes cluster and can verify this by running:

{{C:\windows\system32>kubectl cluster-info}}
{{Kubernetes master is running at https://<ip>:<port>}}
{{KubeDNS is running at https://<ip>:<port>/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy}}

Then I am trying to run the SparkPi example with the Spark distribution I downloaded from [https://spark.apache.org/downloads.html] (I tried versions 2.4.0 and 2.3.3):

{{spark-submit --master k8s://https://<ip>:<port> ^}}
{{  --deploy-mode cluster ^}}
{{  --name spark-pi ^}}
{{  --class org.apache.spark.examples.SparkPi ^}}
{{  --conf spark.executor.instances=2 ^}}
{{  --conf spark.kubernetes.container.image=gettyimages/spark ^}}
{{  c:\users\<username>\Desktop\spark-2.4.0-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.4.0.jar}}

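As far as I can tell from the Kubernetes documentation referenced below, this is the expected form of the master URL: the API server address reported by {{kubectl cluster-info}} above, prefixed with {{k8s://}} and with the port always included (the host and port here are placeholders for my cluster):

{{--master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>}}
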
I am getting this error:

{{Error: Master must either be yarn or start with spark, mesos, local}}
{{Run with --help for usage help or --verbose for debug output}}

I also tried:

{{spark-submit --help}}

to see what it reports for the *--master* option. This is what I get:

{{--master MASTER_URL spark://host:port, mesos://host:port, yarn, or local.}}

According to the documentation on running Spark workloads on Kubernetes ([https://spark.apache.org/docs/latest/running-on-kubernetes.html]) and the list of possible master URLs ([https://spark.apache.org/docs/latest/submitting-applications.html#master-urls]), a {{k8s://}} master should be accepted, yet spark-submit does not even seem to recognise the k8s value for master.

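For completeness, here is a minimal sketch of how the same submission can be run directly from the unpacked 2.4.0 distribution (the directory and the <ip>:<port> values are placeholders from my environment; {{bin\spark-submit.cmd}} is the launcher script shipped in the download, and the {{--version}} call is only there to confirm which Spark build is actually being invoked, in case a different spark-submit earlier on the PATH is being picked up):

{{cd c:\users\<username>\Desktop\spark-2.4.0-bin-hadoop2.7}}
{{bin\spark-submit.cmd --version}}
{{bin\spark-submit.cmd --master k8s://https://<ip>:<port> --deploy-mode cluster --name spark-pi ^}}
{{  --class org.apache.spark.examples.SparkPi ^}}
{{  --conf spark.executor.instances=2 ^}}
{{  --conf spark.kubernetes.container.image=gettyimages/spark ^}}
{{  examples\jars\spark-examples_2.11-2.4.0.jar}}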

 


