[
https://issues.apache.org/jira/browse/SPARK-55330?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18056611#comment-18056611
]
bharath commented on SPARK-55330:
---------------------------------
Any luck? I have explored every way I could find to fix the local dir. Even
describing the driver pod shows the volume mount is present:
Mounts:
  /apps/application/data from spark-local-dir-1 (rw)
  /opt/spark/conf from spark-conf-volume-driver (rw)
  /opt/spark/pod-template from pod-template-volume (rw)
Volumes:
  spark-local-dir-1:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  test-eg1-1b5b769c2d281c4e-driver-pvc-0
    ReadOnly:   false
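A quick check from inside the driver pod can confirm whether the mounted path is
actually writable by the container user; the pod name below is only a placeholder,
not the real one from this run:

# Placeholder driver pod name; replace with the actual driver pod.
kubectl exec test-eg1-driver -- id
# should show the container user's uid/gid and the fsGroup (4222) among its groups
kubectl exec test-eg1-driver -- ls -ld /apps/application/data
# if the directory's group is not 4222 or the group write bit is missing,
# Spark cannot create blockmgr-* directories under this path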
> Unable to create PVC through Spark 4.1.0
> ----------------------------------------
>
> Key: SPARK-55330
> URL: https://issues.apache.org/jira/browse/SPARK-55330
> Project: Spark
> Issue Type: Bug
> Components: Kubernetes, Spark Submit
> Affects Versions: 4.1.1
> Environment: TEST
> Reporter: bharath
> Priority: Blocker
> Labels: kubernetes, spark, spark-conf
> Attachments: wordcount.txt
>
>
> Hi Team,
>
> We host Spark on a Kubernetes environment. The existing Spark versions
> are 3.3.2 and 3.5.2, and the jobs run fine with the existing pod
> template.yaml and sparkapplication.yaml.
>
> We have started testing Spark 4.1.0, and while running the jobs it
> throws the error below:
>
> 26/01/12 09:46:35 ERROR JavaUtils: Failed to create directory /apps/application/data/blockmgr-2469dd51-bfd2-478d-9b3c-d5593bf21c26
> java.nio.file.AccessDeniedException: /apps/application/data/blockmgr-2469dd51-bfd2-478d-9b3c-d5593bf21c26
>     at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:90)
>     at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
>     at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
>     at java.base/sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:397)
>     at java.base/java.nio.file.Files.createDirectory(Files.java:700)
>     at java.base/java.nio.file.Files.createAndCheckIsDirectory(Files.java:807)
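> A minimal way to reproduce the same failure outside of Spark, using a
> placeholder driver pod name, is to try creating a directory under the mount
> path by hand:
>
> # Placeholder pod name; the real driver pod name differs per run.
> kubectl exec test-eg1-driver -- mkdir /apps/application/data/manual-test
> # If this also fails with "Permission denied", the problem is the volume's
> # ownership/permissions rather than anything Spark-specific.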
>
>
> We have created one namespace each in the TEST region for Spark 3.3.2,
> Spark 3.5.2, and Spark 4.1.0. The issue persists only with 4.1.0.
>
> We are using the conf file below in both 3.5.2 and 4.1.0.
> *spark-defaults.conf*
>
> spark.kubernetes.driver.ownPersistentVolumeClaim  true
> spark.kubernetes.driver.reusePersistentVolumeClaim  true
> spark.kubernetes.driver.waitToReusePersistentVolumeClaim  true
> spark.shuffle.sort.io.plugin.class  org.apache.spark.shuffle.KubernetesLocalDiskShuffleDataIO
> spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName  OnDemand
> spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass  spark-sc
> spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit  5Gi
> spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path  /apps/application/data
> spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly  false
> spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName  OnDemand
> spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass  spark-sc
> spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit  50Gi
> spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path  /apps/application/data
> spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly  false
>
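> For reference, the same on-demand PVC settings can also be passed directly on
> spark-submit; this is only a sketch, and the master URL, image, and
> application jar below are placeholders rather than our real values:
>
> spark-submit \
>   --master k8s://https://<api-server> \
>   --deploy-mode cluster \
>   --conf spark.kubernetes.container.image=<spark-4.1.0-image> \
>   --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName=OnDemand \
>   --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass=spark-sc \
>   --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit=5Gi \
>   --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path=/apps/application/data \
>   --conf spark.kubernetes.driver.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly=false \
>   local:///path/to/app.jar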
>
> *pod template file:*
>
> apiVersion: v1
> kind: Pod
> metadata:
> spec:
>   securityContext:
>     fsGroup: 4222
>     fsGroupChangePolicy: OnRootMismatch
>   containers:
>     - name: spark-kubernetes-driver
>       securityContext:
>         runAsNonRoot: true
>         capabilities:
>           drop:
>             - NET_BIND_SERVICE
>         seccompProfile:
>           type: RuntimeDefault
>
>
> *Dockerfile (user account and group)*
>
>
> RUN groupadd -g 4222 spky && \
>     useradd -u 92461 sparknix -d /home/sparknix -g 4222 -G spky -G root --no-log-init
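> One note on the useradd line above: as far as I know, when -G is given twice
> only the last value takes effect, so sparknix may end up in root but not in
> spky; supplementary groups are normally passed once as a comma-separated list.
> A possible variant (an assumption on my side, not a confirmed fix) would be:
>
> # Sketch only: same UID/GID as above, supplementary groups given once so the
> # user belongs to both spky and root.
> RUN groupadd -g 4222 spky && \
>     useradd -u 92461 -d /home/sparknix -g 4222 -G spky,root --no-log-init sparknix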
>
>