Sabarish kumar created SPARK-55010:
--------------------------------------

             Summary: Spark job fails to create local dir for job execution in Spark 4.0.1
                 Key: SPARK-55010
                 URL: https://issues.apache.org/jira/browse/SPARK-55010
             Project: Spark
          Issue Type: Task
          Components: Spark Submit
    Affects Versions: 4.0.1
            Reporter: Sabarish kumar


Hello Team,

We are migrating our workspace from Spark 3.5.2 to Spark 4.0.1, and while 
executing a job it failed with the error below, even though the same code and 
setup work fine on Spark 3.5.2. The Spark conf for the job is included below.

Our jobs run on a Kubernetes cluster, and the path inside the Docker image has 
full permission to create this directory, but we are still getting this 
access-denied error. The pod template file is available both locally and on 
S3, and the job fails in both scenarios. Please help us resolve this issue.
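
If it helps triage, a probe along these lines (the driver pod name is a 
placeholder) shows how the mount looks from inside the container:

    # inspect ownership and mode of the PVC mount point (path from the conf below)
    kubectl exec -it <driver-pod> -- ls -ld /apps/application/data
    # attempt a write the way the block manager would
    kubectl exec -it <driver-pod> -- mkdir -p /apps/application/data/probe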

Spark conf

= = =

    spark.kubernetes.driver.podTemplateFile: s3a://scp-corp-bucket/npe/pod/access-pod-template.yaml
    spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path: /apps/application/data
    spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly: false
    spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName: OnDemand
    spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit: 5Gi
    spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass: spark-sc
    spark.kubernetes.executor.podTemplateFile: s3a://scp-corp-bucket/npe/pod/access-pod-template.yaml
    spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path: /apps/application/data
    spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly: false
    spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName: OnDemand
    spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit: 50Gi
    spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass: spark-sc
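
As we understand it, volumes whose names start with spark-local-dir- are 
treated by Spark on Kubernetes as local scratch storage, which is why the 
block manager writes under /apps/application/data. For completeness, a trimmed 
sketch of how these properties are passed at submit time (the API server 
address, main class, and application jar are placeholders):

    spark-submit \
      --master k8s://https://<api-server>:6443 \
      --deploy-mode cluster \
      --conf spark.kubernetes.driver.podTemplateFile=s3a://scp-corp-bucket/npe/pod/access-pod-template.yaml \
      --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path=/apps/application/data \
      --class <main-class> <application-jar>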

 

ERROR
= = = 
26/01/12 09:46:35 ERROR JavaUtils: Failed to create directory /apps/application/data/blockmgr-2469dd51-bfd2-478d-9b3c-d5593bf21c26
java.nio.file.AccessDeniedException: /apps/application/data/blockmgr-2469dd51-bfd2-478d-9b3c-d5593bf21c26


POD Template

= = =
apiVersion: v1
kind: Pod
metadata:
spec:
  securityContext:
    fsGroup: 4222
    fsGroupChangePolicy: OnRootMismatch
  containers:
    - name: spark-kubernetes-driver
      securityContext:
        runAsNonRoot: true
        capabilities:
          drop:
            - NET_BIND_SERVICE
        seccompProfile:
          type: RuntimeDefault

USER

= = =

sparknix@33d77f449eb9:/opt/spark$ id
uid=92461(sparknix) gid=4222(spky) groups=4222(spky),0(root)
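
Given fsGroup: 4222 in the template and gid=4222 in the id output above, the 
mount should become group-writable once fsGroupChangePolicy: OnRootMismatch is 
applied; a quick check from inside the running container (assuming GNU 
coreutils in the image) would be:

    # show the owner uid:gid and octal mode the kubelet applied to the mount
    stat -c '%u:%g %a %n' /apps/application/data
    # attempt the same kind of directory creation the block manager performs
    mkdir -p /apps/application/data/manual-probe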


