[ https://issues.apache.org/jira/browse/SPARK-33063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-33063.
-----------------------------------
    Fix Version/s: 3.1.0
         Assignee: German Schiavon Matteo  (was: Apache Spark)
       Resolution: Fixed

This is resolved via https://github.com/apache/spark/pull/29941

> Improve error message for insufficient K8s volume confs
> -------------------------------------------------------
>
>                 Key: SPARK-33063
>                 URL: https://issues.apache.org/jira/browse/SPARK-33063
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Kubernetes
>    Affects Versions: 3.0.0, 3.0.1
>            Reporter: German Schiavon Matteo
>            Assignee: German Schiavon Matteo
>            Priority: Minor
>             Fix For: 3.1.0
>
>
> Add error handling when creating k8s volumes, with clearer error messages.
> For example, when creating a *hostPath* volume, if you don't specify
> {code:java}
> hostPath.volumeName.options.path
> {code}
> it fails with a
> {code:java}
> key not found
> {code}
> error, which makes it clear that a key is missing, but I couldn't find
> anywhere in the docs saying that this key has to be specified.
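>
> As a sketch of the improvement, the lookup can name the full conf key instead
> of letting a bare Map#apply surface "key not found". This is a hypothetical,
> minimal illustration, not the code merged in the PR above:
> {code:scala}
> object VolumeConfCheck {
>   // Hypothetical helper: look up a required volume option and fail with the
>   // full conf key instead of a bare "key not found" from Map#apply.
>   def requireVolumeConf(options: Map[String, String], prefix: String, key: String): String =
>     options.getOrElse(key, throw new NoSuchElementException(
>       s"$prefix$key is required but was not specified"))
>
>   def main(args: Array[String]): Unit = {
>     val prefix = "spark.kubernetes.driver.volumes.hostPath.spark.options."
>     val opts = Map.empty[String, String] // user never set options.path
>     println(requireVolumeConf(opts, prefix, "path")) // fails, naming the key
>   }
> }
> {code}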
> To reproduce the issue, run a spark-submit command like this one:
> {code:java}
> ./bin/spark-submit \
>   --master k8s://https://127.0.0.1:32768 \
>   --deploy-mode cluster \
>   --name spark-app \
>   --class class \
>   --conf spark.kubernetes.driver.volumes.hostPath.spark.mount.path=/tmp/jars/ \
>   --conf spark.kubernetes.executor.volumes.hostPath.spark.mount.path=/tmp/jars/ \
>   --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
>   --conf spark.kubernetes.container.image=spark:latest \
>   local:///opt/spark/examples/jars/app.jar
> {code}
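>
> For reference, the failure disappears once the volume's options.path conf is
> also set, following the documented pattern
> spark.kubernetes.[driver|executor].volumes.[VolumeType].[VolumeName].options.path;
> the host path below is only illustrative:
> {code:java}
> --conf spark.kubernetes.driver.volumes.hostPath.spark.options.path=/tmp/jars/ \
> --conf spark.kubernetes.executor.volumes.hostPath.spark.options.path=/tmp/jars/
> {code}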
>  


