[jira] [Commented] (SPARK-28921) Spark jobs failing on latest versions of Kubernetes (1.15.3, 1.14.6, 1.13.10, 1.12.10, 1.11.10)

2022-10-18 Thread jiangjian (Jira)


[ https://issues.apache.org/jira/browse/SPARK-28921?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17619927#comment-17619927 ]

jiangjian commented on SPARK-28921:
---

[~thesuperzapper] Where can I get these two jar packages (okhttp-*.jar, okio-*.jar)?
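
For reference: both artifacts are published on Maven Central under com.squareup.okhttp3 and com.squareup.okio. A minimal sketch of pulling them into a Spark image, assuming the standard /opt/spark/jars layout; the version numbers below are placeholders and should be matched to the kubernetes-client release you pair them with:

{code:bash}
# Placeholder versions; align them with your kubernetes-client release.
OKHTTP_VERSION=3.12.0
OKIO_VERSION=1.15.0

# Fetch the jars from Maven Central.
wget "https://repo1.maven.org/maven2/com/squareup/okhttp3/okhttp/${OKHTTP_VERSION}/okhttp-${OKHTTP_VERSION}.jar"
wget "https://repo1.maven.org/maven2/com/squareup/okio/okio/${OKIO_VERSION}/okio-${OKIO_VERSION}.jar"

# Replace the copies bundled with Spark (path assumes the standard image layout).
rm /opt/spark/jars/okhttp-*.jar /opt/spark/jars/okio-*.jar
cp "okhttp-${OKHTTP_VERSION}.jar" "okio-${OKIO_VERSION}.jar" /opt/spark/jars/
{code}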

> Spark jobs failing on latest versions of Kubernetes (1.15.3, 1.14.6, 1.13.10, 1.12.10, 1.11.10)
> ---
>
> Key: SPARK-28921
> URL: https://issues.apache.org/jira/browse/SPARK-28921
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes, Spark Core
>Affects Versions: 2.3.0, 2.3.1, 2.3.3, 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4
>Reporter: Paul Schweigert
>Assignee: Andy Grove
>Priority: Major
> Fix For: 2.4.5, 3.0.0
>
>
> Spark jobs are failing on latest versions of Kubernetes when jobs attempt to 
> provision executor pods (jobs like Spark-Pi that do not launch executors run 
> without a problem):
>  
> Here's an example error message:
>  
> {code:java}
> 19/08/30 01:29:09 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes.
> 19/08/30 01:29:09 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes.
> 19/08/30 01:29:09 WARN WatchConnectionManager: Exec Failure: HTTP 403, Status: 403 -
> java.net.ProtocolException: Expected HTTP 101 response but was '403 Forbidden'
> at okhttp3.internal.ws.RealWebSocket.checkResponse(RealWebSocket.java:216)
> at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:183)
> at okhttp3.RealCall$AsyncCall.execute(RealCall.java:141)
> at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> {code}
>  
> Looks like the issue is caused by fixes for a recent CVE:
> CVE: [https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14809]
> Fix: [https://github.com/fabric8io/kubernetes-client/pull/1669]
>  
> Looks like upgrading kubernetes-client to 4.4.2 would solve this issue.
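
For anyone patching an existing deployment rather than waiting for a fixed Spark release, one possible approach is to swap the bundled fabric8 client jar in a derived image; a sketch, assuming the standard Spark image layout (note that kubernetes-client ships with matching kubernetes-model-* dependencies, which may need the same treatment):

{code:bash}
# Fetch kubernetes-client 4.4.2 (the version carrying the fix) from Maven Central.
wget "https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.4.2/kubernetes-client-4.4.2.jar"

# Replace the older client bundled with Spark (path assumes the standard image layout).
rm /opt/spark/jars/kubernetes-client-*.jar
cp kubernetes-client-4.4.2.jar /opt/spark/jars/
{code}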



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-17 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Attachment: Dockerfile-2

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy, Spark Submit
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: Dockerfile, Dockerfile-1, Dockerfile-2, spark-error.log
>

[jira] [Commented] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-17 Thread jiangjian (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17619249#comment-17619249 ]

jiangjian commented on SPARK-40814:
---

[~hyukjin.kwon] How do I correctly modify the user in the Spark image? I used the 3.1.2 image and got the following error:

22/10/18 05:14:58 ERROR CoarseGrainedExecutorBackend: Executor self-exiting due to : Unable to create executor due to ./spark-examples_2.12-3.1.2.jar
java.nio.file.AccessDeniedException: ./spark-examples_2.12-3.1.2.jar

This is my Dockerfile: [^Dockerfile]
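
For what it's worth, the AccessDeniedException usually means the new UID cannot read the application jar in the executor's working directory. Below is a minimal sketch of a derived image that switches users while keeping Spark's directories accessible; the base image name is a placeholder, the UID/GID 2023 and user name zndw are taken from the trace in this thread, and it assumes a Debian-based Spark 3.1.x image where /opt/spark/work-dir is the working directory:

{code}
# Placeholder base: an image built with Spark's bin/docker-image-tool.sh for 3.1.2.
FROM your-registry/spark:v3.1.2
USER root
# Create the non-root user and make the working directory readable/writable by it;
# executors resolve ./<jar> against /opt/spark/work-dir, hence the chown.
RUN groupadd -g 2023 zndw && \
    useradd -u 2023 -g 2023 -m -d /home/zndw -s /bin/sh zndw && \
    chown -R zndw:zndw /opt/spark/work-dir
USER 2023
{code}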

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy, Spark Submit
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: Dockerfile, Dockerfile-1, Dockerfile-2, spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-17 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Attachment: Dockerfile-1

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy, Spark Submit
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: Dockerfile, Dockerfile-1, spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Priority: Blocker  (was: Major)

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy, Spark Submit
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Blocker
> Attachments: Dockerfile, spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Component/s: Spark Submit

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy, Spark Submit
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: Dockerfile, spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Attachment: Dockerfile

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: Dockerfile, spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Environment: 
k8s version: v1.18.9

spark version: v2.4.0

kubernetes-client:v6.1.1

  was:
k8s version: v1.18.9

spark version: v2.4.0


> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
> kubernetes-client:v6.1.1
>Reporter: jiangjian
>Priority: Major
> Attachments: spark-error.log
>

[jira] [Updated] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangjian updated SPARK-40814:
--
Attachment: spark-error.log

> Exception in thread "main" java.lang.NoClassDefFoundError: 
> io/fabric8/kubernetes/client/KubernetesClient
> 
>
> Key: SPARK-40814
> URL: https://issues.apache.org/jira/browse/SPARK-40814
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 2.4.0
> Environment: k8s version: v1.18.9
> spark version: v2.4.0
>Reporter: jiangjian
>Priority: Major
> Attachments: spark-error.log
>

[jira] [Created] (SPARK-40814) Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient

2022-10-16 Thread jiangjian (Jira)
jiangjian created SPARK-40814:
-

 Summary: Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient
 Key: SPARK-40814
 URL: https://issues.apache.org/jira/browse/SPARK-40814
 Project: Spark
  Issue Type: Bug
  Components: Deploy
Affects Versions: 2.4.0
 Environment: k8s version: v1.18.9

spark version: v2.4.0
Reporter: jiangjian


After I change the user in the Spark image, the running program reports an error. What is the problem?

++ id -u
+ myuid=2023
++ id -g
+ mygid=2023
+ set +e
++ getent passwd 2023
+ uidentry=zndw:x:2023:2023::/home/zndw:/bin/sh
+ set -e
+ '[' -z zndw:x:2023:2023::/home/zndw:/bin/sh ']'
+ SPARK_K8S_CMD=driver
+ case "$SPARK_K8S_CMD" in
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ PYSPARK_ARGS=
+ '[' -n '' ']'
+ R_ARGS=
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.1.11 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.frontier.pueedas.computer.batchTool.etl.EtlScheduler 'http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false' configMode=HDFS metaMode=HDFS platformConfigMode=NACOS storeConfigMode=NACOS startDate=2022-08-02 endDate=2022-08-03 _file=/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml runMode=TEST
2022-10-14 06:52:21 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-10-14 06:52:29 INFO  SparkContext:54 - Running Spark version 2.4.0
2022-10-14 06:52:29 INFO  SparkContext:54 - Submitted application: [TEST]ETL[2022-08-02 00:00:00,2022-08-03 00:00:00]{/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml}
2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing view acls to: zndw,root
2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing modify acls to: zndw,root
2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing view acls groups to: 
2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing modify acls groups to: 
2022-10-14 06:52:29 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(zndw, root); groups with view permissions: Set(); users  with modify permissions: Set(zndw, root); groups with modify permissions: Set()
2022-10-14 06:52:29 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 7078.
2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering MapOutputTracker
2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering BlockManagerMaster
2022-10-14 06:52:29 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2022-10-14 06:52:29 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2022-10-14 06:52:29 INFO  DiskBlockManager:54 - Created local directory at /var/data/spark-9a270950-7527-4d08-a7bd-d6c1062e8522/blockmgr-79ab0f0d-6f9e-401e-aa90-91baa00a3ff3
2022-10-14 06:52:29 INFO  MemoryStore:54 - MemoryStore started with capacity 912.3 MB
2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2022-10-14 06:52:30 INFO  log:192 - Logging initialized @9926ms
2022-10-14 06:52:30 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2022-10-14 06:52:30 INFO  Server:419 - Started @10035ms
2022-10-14 06:52:30 INFO  AbstractConnector:278 - Started ServerConnector@66f0548d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-10-14 06:52:30 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@59ed3e6c{/jobs,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@70c53dbe{/jobs/json,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1894e40d{/jobs/job,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7342e05d{/jobs/job/json,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2a331b46{/stages,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@15383681{/stages/json,null,AVAILABLE,@Spark}
2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@743e66f7{/stages/stag