[jira] [Commented] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient
[ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17619249#comment-17619249 ] jiangjian commented on SPARK-40814:
-----------------------------------
[~hyukjin.kwon] How do I correctly modify the user in the Spark image? I used the 3.1.2 image and got the following error (see the Dockerfile sketch after the quoted issue below):

22/10/18 05:14:58 ERROR CoarseGrainedExecutorBackend: Executor self-exiting due to : Unable to create executor due to ./spark-examples_2.12-3.1.2.jar
java.nio.file.AccessDeniedException: ./spark-examples_2.12-3.1.2.jar

This is my Dockerfile: [^Dockerfile]

> Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
> Issue Type: Bug
> Components: Deploy, Spark Submit
> Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client: v6.1.1
> Reporter: jiangjian
> Priority: Major
> Attachments: Dockerfile, Dockerfile-1, Dockerfile-2, spark-error.log
>
> After I change the user in the Spark image, the running program reports an error. What is the problem?
>
> ++ id -u
> + myuid=2023
> ++ id -g
> + mygid=2023
> + set +e
> ++ getent passwd 2023
> + uidentry=zndw:x:2023:2023::/home/zndw:/bin/sh
> + set -e
> + '[' -z zndw:x:2023:2023::/home/zndw:/bin/sh ']'
> + SPARK_K8S_CMD=driver
> + case "$SPARK_K8S_CMD" in
> + shift 1
> + SPARK_CLASSPATH=':/opt/spark/jars/*'
> + env
> + grep SPARK_JAVA_OPT_
> + sort -t_ -k4 -n
> + sed 's/[^=]*=\(.*\)/\1/g'
> + readarray -t SPARK_EXECUTOR_JAVA_OPTS
> + '[' -n '' ']'
> + '[' -n '' ']'
> + PYSPARK_ARGS=
> + '[' -n '' ']'
> + R_ARGS=
> + '[' -n '' ']'
> + '[' '' == 2 ']'
> + '[' '' == 3 ']'
> + case "$SPARK_K8S_CMD" in
> + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
> + exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.1.11 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.frontier.pueedas.computer.batchTool.etl.EtlScheduler 'http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false' configMode=HDFS metaMode=HDFS platformConfigMode=NACOS storeConfigMode=NACOS startDate=2022-08-02 endDate=2022-08-03 _file=/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml runMode=TEST
> 2022-10-14 06:52:21 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2022-10-14 06:52:29 INFO SparkContext:54 - Running Spark version 2.4.0
> 2022-10-14 06:52:29 INFO SparkContext:54 - Submitted application: [TEST]ETL[2022-08-02 00:00:00,2022-08-03 00:00:00]{/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml}
> 2022-10-14 06:52:29 INFO SecurityManager:54 - Changing view acls to: zndw,root
> 2022-10-14 06:52:29 INFO SecurityManager:54 - Changing modify acls to: zndw,root
> 2022-10-14 06:52:29 INFO SecurityManager:54 - Changing view acls groups to:
> 2022-10-14 06:52:29 INFO SecurityManager:54 - Changing modify acls groups to:
> 2022-10-14 06:52:29 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(zndw, root); groups with view permissions: Set(); users with modify permissions: Set(zndw, root); groups with modify permissions: Set()
> 2022-10-14 06:52:29 INFO Utils:54 - Successfully started service 'sparkDriver' on port 7078.
> 2022-10-14 06:52:29 INFO SparkEnv:54 - Registering MapOutputTracker
> 2022-10-14 06:52:29 INFO SparkEnv:54 - Registering BlockManagerMaster
> 2022-10-14 06:52:29 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
> 2022-10-14 06:52:29 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
> 2022-10-14 06:52:29 INFO DiskBlockManager:54 - Created local directory at /var/data/spark-9a270950-7527-4d08-a7bd-d6c1062e8522/blockmgr-79ab0f0d-6f9e-401e-aa90-91baa00a3ff3
> 2022-10-14 06:52:29 INFO MemoryStore:54 - MemoryStore started with capacity 912.3 MB
> 2022-10-14 06:52:29 INFO SparkEnv:54 - Registering OutputCommitCoordinator
> 2022-10-14 06:52:30 INFO log:192 - Logging initialized @9926ms
> 2022-10-14 06:52:30 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
> 2022-10-14 06:52:30 INFO Server:419 - Started @10035ms
> 2022-10-14 06:52:30 INFO AbstractConnector:278 - Started ServerConnector@66f0548d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
> 2022-10-14 06:52:30 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
> 2022-10-14 06:52:30 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@59ed3e6c{/jobs,null,AVAILABLE,@Spark}
> 2022-10-14 06:52:30 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@70c53dbe{/jobs/j
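The AccessDeniedException above usually means the executor's working directory, or the application jar localized into it, is not readable by the new UID. As a minimal sketch of the user change (this is not the attached Dockerfile; the base tag "spark-base:3.1.2" is an assumption standing in for an image built with Spark's bin/docker-image-tool.sh, with Spark under /opt/spark as in the stock image):

# Minimal sketch, assuming a Debian-based Spark 3.1.2 base image where
# useradd/groupadd are available and WORKDIR is /opt/spark/work-dir.
FROM spark-base:3.1.2

USER root

# Create the user/group seen in the entrypoint trace (uid/gid 2023), then
# make the work dir (where executors localize application jars such as
# spark-examples_2.12-3.1.2.jar) owned by that user. Without readable
# files there, executors can fail with java.nio.file.AccessDeniedException.
RUN groupadd -g 2023 zndw && \
    useradd -u 2023 -g 2023 -m -s /bin/sh zndw && \
    chown -R zndw:zndw /opt/spark/work-dir && \
    chmod -R a+rX /opt/spark/jars

# Switch to the numeric UID so Kubernetes runAsNonRoot checks also pass.
USER 2023
WORKDIR /opt/spark/work-dir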
[jira] [Commented] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient
[ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17618386#comment-17618386 ] Hyukjin Kwon commented on SPARK-40814:
--------------------------------------
Spark 2.4.x is EOL. Mind trying if the same issue persists in Spark 3+?
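When retrying on Spark 3.x, the build itself can assert that the distribution actually bundles the fabric8 client, since its absence from the classpath is what surfaces as NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient. A hedged sketch, where "my-spark:3.1.2" is a placeholder for an image built from a Spark 3.x distribution with Kubernetes support:

# Sketch only: rebase the custom image on a Spark 3.x build to test
# whether the error persists.
FROM my-spark:3.1.2

USER root

# Fail the build early if the bundled fabric8 kubernetes-client jar is
# missing -- that jar provides the class the driver fails to load here.
RUN ls /opt/spark/jars/kubernetes-client-*.jar

# Recreate the same non-root user as before (uid/gid 2023).
RUN groupadd -g 2023 zndw && useradd -u 2023 -g 2023 -m -s /bin/sh zndw
USER 2023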