[jira] [Commented] (SPARK-27059) spark-submit on kubernetes cluster does not recognise k8s --master property

2019-03-05 Thread Andreas Adamides (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-27059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16784758#comment-16784758
 ] 

Andreas Adamides commented on SPARK-27059:
--

Indeed, with both Spark 2.4.0 and 2.3.3 installed, running

*spark-submit --version*

returns "version 2.2.1" (and so does spark-shell).

So if not from the official Spark download page, where would I download the
latest advertised Spark version that supports Kubernetes?
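A version mismatch like this usually means an older Spark installation earlier on the PATH (or a stale SPARK_HOME) is shadowing the downloaded release. A quick diagnostic sketch, assuming a POSIX shell (the commands below are a generic check, not from this thread):

```shell
# Diagnostic sketch: find out which spark-submit binary the shell actually
# resolves. An older Spark 2.2.x earlier on PATH, or a stale SPARK_HOME,
# would explain a 2.4.0/2.3.3 download reporting "version 2.2.1".
command -v spark-submit || echo "spark-submit not found on PATH"
echo "SPARK_HOME=${SPARK_HOME:-<unset>}"
```

If the resolved path points outside the freshly unpacked distribution, invoking `bin\spark-submit` from inside that distribution directory sidesteps the stale copy.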

> spark-submit on kubernetes cluster does not recognise k8s --master property
> ---
>
> Key: SPARK-27059
> URL: https://issues.apache.org/jira/browse/SPARK-27059
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes
>Affects Versions: 2.3.3, 2.4.0
>Reporter: Andreas Adamides
>Priority: Blocker
>
> I have successfully installed a Kubernetes cluster and can verify this by:
> {{C:\windows\system32>kubectl cluster-info}}
> {{Kubernetes master is running at https://<ip>:<port>}}
> {{KubeDNS is running at https://<ip>:<port>/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy}}
> Trying to run the SparkPi example with the Spark release I downloaded from 
> [https://spark.apache.org/downloads.html] (I tried versions 2.4.0 and 2.3.3):
> {{spark-submit --master k8s://https://<ip>:<port> --deploy-mode cluster 
> --name spark-pi --class org.apache.spark.examples.SparkPi --conf 
> spark.executor.instances=2 --conf 
> spark.kubernetes.container.image=gettyimages/spark 
> c:\users\<user>\Desktop\spark-2.4.0-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.4.0.jar}}
> I am getting this error:
> {{Error: Master must either be yarn or start with spark, mesos, local. Run 
> with --help for usage help or --verbose for debug output}}
> I also tried:
> {{spark-submit --help}}
> to see what I can get regarding the *--master* property. This is what I get:
> {{--master MASTER_URL  spark://host:port, mesos://host:port, yarn, or local.}}
> According to the documentation on running Spark workloads in Kubernetes 
> ([https://spark.apache.org/docs/latest/running-on-kubernetes.html]), 
> spark-submit does not even seem to recognise the k8s:// value for master, 
> although it is included in the list of possible Spark masters: 
> [https://spark.apache.org/docs/latest/submitting-applications.html#master-urls]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org


[jira] [Created] (SPARK-27059) spark-submit on kubernetes cluster does not recognise k8s --master property

2019-03-05 Thread Andreas Adamides (JIRA)
Andreas Adamides created SPARK-27059:


 Summary: spark-submit on kubernetes cluster does not recognise k8s 
--master property
 Key: SPARK-27059
 URL: https://issues.apache.org/jira/browse/SPARK-27059
 Project: Spark
  Issue Type: Bug
  Components: Kubernetes
Affects Versions: 2.4.0, 2.3.3
Reporter: Andreas Adamides


I have successfully installed a Kubernetes cluster and can verify this by:

{{C:\windows\system32>kubectl cluster-info}}
{{Kubernetes master is running at https://<ip>:<port>}}
{{KubeDNS is running at https://<ip>:<port>/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy}}

Then I am trying to run the SparkPi example with the Spark release I downloaded from 
[https://spark.apache.org/downloads.html] (I tried versions 2.4.0 and 2.3.3):

{{spark-submit --master k8s://https://<ip>:<port> --deploy-mode cluster --name 
spark-pi --class org.apache.spark.examples.SparkPi --conf 
spark.executor.instances=2 --conf 
spark.kubernetes.container.image=gettyimages/spark 
c:\users\<user>\Desktop\spark-2.4.0-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.4.0.jar}}

I am getting this error:

{{Error: Master must either be yarn or start with spark, mesos, local. Run with 
--help for usage help or --verbose for debug output}}

I also tried:

{{spark-submit --help}}

to see what I can get regarding the *--master* property. This is what I get:

{{--master MASTER_URL  spark://host:port, mesos://host:port, yarn, or local.}}

According to the documentation on running Spark workloads in Kubernetes 
([https://spark.apache.org/docs/latest/running-on-kubernetes.html]), 
spark-submit does not even seem to recognise the k8s:// value for master, 
although it is included in the list of possible Spark masters: 
[https://spark.apache.org/docs/latest/submitting-applications.html#master-urls]
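The error message above comes from spark-submit's validation of the master URL scheme. Support for the k8s:// scheme was only added in Spark 2.3, so a stale 2.2.x binary rejects it even when a 2.4.0 distribution has been downloaded. A rough Python sketch of that prefix check (an illustration of the behaviour, not Spark's actual code):

```python
# Illustrative sketch of the master-URL prefix validation spark-submit
# performs (not Spark's actual implementation). Spark 2.2 accepted only
# yarn, spark://, mesos://, and local*; 2.3+ added k8s://, which is why
# an old 2.2.1 binary rejects a Kubernetes master URL.
def master_accepted(master: str, spark_version: str) -> bool:
    prefixes = ["spark://", "mesos://", "local"]
    major, minor = (int(x) for x in spark_version.split(".")[:2])
    if (major, minor) >= (2, 3):
        prefixes.append("k8s://")
    return master == "yarn" or any(master.startswith(p) for p in prefixes)

print(master_accepted("k8s://https://1.2.3.4:6443", "2.2.1"))  # False
print(master_accepted("k8s://https://1.2.3.4:6443", "2.4.0"))  # True
```

This matches the symptom in the report: the same command line is valid for a genuine 2.3.3/2.4.0 spark-submit but fails with "Master must either be yarn or start with spark, mesos, local" on 2.2.1.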

 

