[ 
https://issues.apache.org/jira/browse/SPARK-38930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sichun zhai updated SPARK-38930:
--------------------------------
    Description: 
In standalone deploy mode, when running org.apache.spark.examples.SparkPi or 
any other Spark program, the UI always shows the executors' status as KILLED.

Spark worker log:

22/04/18 17:08:27 INFO Worker: Asked to kill executor app-20220418170822-0039/0
22/04/18 17:08:27 INFO ExecutorRunner: Runner thread for executor 
app-20220418170822-0039/0 interrupted
22/04/18 17:08:27 INFO ExecutorRunner: Killing process!
22/04/18 17:08:27 DEBUG SizeBasedRollingPolicy: 55 + 18896 > 1073741824
22/04/18 17:08:27 DEBUG RollingFileAppender: Closed file 
/opt/spark/work/app-20220418170822-0039/0/stderr
22/04/18 17:08:27 DEBUG RollingFileAppender: Closed file 
/opt/spark/work/app-20220418170822-0039/0/stdout
22/04/18 17:08:27 INFO ExecutorRunner: exitCode:Some(143)
22/04/18 17:08:27 INFO Worker: Executor app-20220418170822-0039/0 finished with 
state KILLED exitStatus 143
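
For context, exit status 143 is the conventional POSIX encoding for a process 
terminated by SIGTERM (128 + 15), which is consistent with the worker deliberately 
killing the executor process ("Asked to kill executor" / "Killing process!") rather 
than the executor crashing on its own. A quick shell illustration of that encoding 
(not Spark-specific, just the convention):

sleep 60 &        # stand-in for any child process
kill -TERM $!     # send SIGTERM, as the worker does when it kills an executor
wait $!           # wait returns the child's exit status
echo $?           # prints 143 (128 + 15)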

 

There is an existing patch: https://github.com/apache/spark/pull/12012

Command used to run SparkPi:

/opt/app/applications/bd-spark/bin/run-example --class 
org.apache.spark.examples.SparkPi --master 
spark://10.205.90.120:7077,10.205.90.131:7077 --deploy-mode cluster 
--driver-java-options 
"-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties" 
--conf 
spark.executor.extraJavaOptions="-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"
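
(For reference, bin/run-example is a thin wrapper around spark-submit with the 
examples jar; an equivalent direct invocation would look roughly like the one below. 
The examples jar path and the version in its name are assumptions based on a 
standard Spark 3.1.x distribution layout, not taken from this report.)

/opt/app/applications/bd-spark/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://10.205.90.120:7077,10.205.90.131:7077 \
  --deploy-mode cluster \
  --driver-java-options "-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties" \
  --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties" \
  /opt/app/applications/bd-spark/examples/jars/spark-examples_2.12-3.1.2.jar   # examples jar path/name assumed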

 

  was:
In standalone deploy mode, when running org.apache.spark.examples.SparkPi or 
any other Spark program, the UI always shows the executors' status as KILLED.

There is an existing patch: https://github.com/apache/spark/pull/12012

Command used to run SparkPi:

/opt/app/applications/bd-spark/bin/run-example  --class 
org.apache.spark.examples.SparkPi  --master 
spark://10.205.90.120:7077,10.205.90.131:7077 --deploy-mode cluster 
--driver-java-options 
"-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"
 --conf 
spark.executor.extraJavaOptions="-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"

 


> Spark executor status is always KILLED
> --------------------------------------
>
>                 Key: SPARK-38930
>                 URL: https://issues.apache.org/jira/browse/SPARK-38930
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.2, 3.1.3
>            Reporter: sichun zhai
>            Priority: Major
>         Attachments: spark-default.conf, spark-env.sh, spark-ui.png, stderr
>
>
> In standalone deploy mode, when running org.apache.spark.examples.SparkPi 
> or any other Spark program, the UI always shows the executors' status as KILLED.
> Spark worker log:
> 22/04/18 17:08:27 INFO Worker: Asked to kill executor 
> app-20220418170822-0039/0
> 22/04/18 17:08:27 INFO ExecutorRunner: Runner thread for executor 
> app-20220418170822-0039/0 interrupted
> 22/04/18 17:08:27 INFO ExecutorRunner: Killing process!
> 22/04/18 17:08:27 DEBUG SizeBasedRollingPolicy: 55 + 18896 > 1073741824
> 22/04/18 17:08:27 DEBUG RollingFileAppender: Closed file 
> /opt/spark/work/app-20220418170822-0039/0/stderr
> 22/04/18 17:08:27 DEBUG RollingFileAppender: Closed file 
> /opt/spark/work/app-20220418170822-0039/0/stdout
> 22/04/18 17:08:27 INFO ExecutorRunner: exitCode:Some(143)
> 22/04/18 17:08:27 INFO Worker: Executor app-20220418170822-0039/0 finished 
> with state KILLED exitStatus 143
>  
> There is an existing patch: https://github.com/apache/spark/pull/12012
> Command used to run SparkPi:
> /opt/app/applications/bd-spark/bin/run-example --class 
> org.apache.spark.examples.SparkPi --master 
> spark://10.205.90.120:7077,10.205.90.131:7077 --deploy-mode cluster 
> --driver-java-options 
> "-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"
>  --conf 
> spark.executor.extraJavaOptions="-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"
>  



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
