[jira] [Resolved] (SPARK-29152) Spark Executor Plugin API shutdown is not proper when dynamic allocation enabled

2019-12-10 Thread Marcelo Masiero Vanzin (Jira)


[ https://issues.apache.org/jira/browse/SPARK-29152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Masiero Vanzin resolved SPARK-29152.

Fix Version/s: 3.0.0
   Resolution: Fixed

Issue resolved by pull request 26810
[https://github.com/apache/spark/pull/26810]

> Spark Executor Plugin API shutdown is not proper when dynamic allocation 
> enabled
> 
>
> Key: SPARK-29152
> URL: https://issues.apache.org/jira/browse/SPARK-29152
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4, 3.0.0
>Reporter: jobit mathew
>Assignee: Rakesh Raushan
>Priority: Major
> Fix For: 3.0.0
>
>
> *Issue Description*
> The Spark Executor Plugin API's *shutdown handling is not correct* when dynamic 
> allocation is enabled: the plugin's shutdown method is not invoked when 
> *executors are removed* after the idle timeout.
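> For reference, the API under test is org.apache.spark.ExecutorPlugin from the
> Spark 2.4.x line. A minimal sketch of that interface (paraphrased, not the exact
> source) looks like the following; both hooks have empty default bodies:
>
>     // Executor-side plugin hooks (sketch of org.apache.spark.ExecutorPlugin, Spark 2.4.x).
>     public interface ExecutorPlugin {
>         // Called once when the executor starts, before it runs tasks.
>         default void init() {}
>         // Expected to run when the executor shuts down; this is the call that is
>         // missing when dynamic allocation removes an idle executor.
>         default void shutdown() {}
>     }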
> *Test Precondition*
> 1. Create a plugin class and package it into a jar named SparkExecutorPlugin.jar:
>
>     import org.apache.spark.ExecutorPlugin;
>
>     public class ExecutoTest1 implements ExecutorPlugin {
>         @Override
>         public void init() {
>             System.out.println("Executor Plugin Initialised.");
>         }
>
>         @Override
>         public void shutdown() {
>             System.out.println("Executor plugin closed successfully.");
>         }
>     }
>
> 2. Put the jar in the folder /spark/examples/jars.
> *Test Steps*
> 1. Launch bin/spark-sql with dynamic allocation enabled:
>
>     ./spark-sql --master yarn \
>         --conf spark.executor.plugins=ExecutoTest1 \
>         --jars /opt/HA/C10/install/spark/spark/examples/jars/SparkExecutorPlugin.jar \
>         --conf spark.dynamicAllocation.enabled=true \
>         --conf spark.dynamicAllocation.initialExecutors=2 \
>         --conf spark.dynamicAllocation.minExecutors=1
>
> 2. Create a table, insert data, and run select * from tablename.
> 3. Check the Spark UI Jobs tab / SQL tab.
> 4. Check each executor's application log file (the Executors tab lists all 
> executors) for the executor plugin initialization and shutdown messages; a plugin 
> variant that makes this check less dependent on stdout buffering is sketched after 
> these steps. Example log location:
> /yarn/logdir/application_1567156749079_0025/container_e02_1567156749079_0025_01_05/stdout
> 5. Wait for an executor to be removed after the inactive time and check the same 
> container log.
> 6. Kill the spark-sql application and check the container log for the executor 
> plugin shutdown message.
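> The plugin variant below is hypothetical (the class name ExecutoTest1FileLog and
> the /tmp marker path are assumptions, not part of this report); it only illustrates
> how steps 4 to 6 can be checked without relying on stdout in the container logs,
> by appending init/shutdown markers to a fixed file:
>
>     import java.io.FileWriter;
>     import java.io.IOException;
>     import org.apache.spark.ExecutorPlugin;
>
>     public class ExecutoTest1FileLog implements ExecutorPlugin {
>         private static final String MARKER_FILE = "/tmp/executor-plugin-markers.log";
>
>         // Append a timestamped marker line; fall back to stdout if the write fails.
>         private void mark(String event) {
>             try (FileWriter w = new FileWriter(MARKER_FILE, true)) {
>                 w.write(System.currentTimeMillis() + " " + event + System.lineSeparator());
>             } catch (IOException e) {
>                 System.out.println(event + " (marker file write failed: " + e + ")");
>             }
>         }
>
>         @Override
>         public void init() {
>             mark("Executor Plugin Initialised.");
>         }
>
>         @Override
>         public void shutdown() {
>             mark("Executor plugin closed successfully.");
>         }
>     }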
> *Expected Output*
> 1. The job should succeed: the create table, insert, and select queries should 
> all succeed.
> 2. While the query is running, every executor's log should contain the plugin 
> init message:
> "Executor Plugin Initialised."
> 3. Once an executor is removed, the shutdown message should appear in its log file:
> "Executor plugin closed successfully."
> 4. Once the spark-sql application is closed, the shutdown message should appear 
> in the log:
> "Executor plugin closed successfully."
> *Actual Output*
> The shutdown message is never printed when an executor is removed after the 
> inactive time, i.e. the plugin's shutdown method is not called.
> *Observation*
> Without dynamic allocation the executor plugin works fine, but once dynamic 
> allocation is enabled the executor plugin's shutdown is not processed.
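> A further hypothetical diagnostic (the class name ExecutoTest1WithHook is an
> assumption): registering a JVM shutdown hook in init() separates the two possible
> failure modes. If the hook fires but the plugin shutdown message never appears,
> the executor JVM is exiting without calling ExecutorPlugin.shutdown(); if neither
> appears, the container is being killed outright. This is only a diagnostic sketch,
> not a fix for the issue.
>
>     import org.apache.spark.ExecutorPlugin;
>
>     public class ExecutoTest1WithHook implements ExecutorPlugin {
>         private volatile boolean shutdownCalled = false;
>
>         @Override
>         public void init() {
>             System.out.println("Executor Plugin Initialised.");
>             // Fires on normal JVM exit even if ExecutorPlugin.shutdown() was skipped.
>             Runtime.getRuntime().addShutdownHook(new Thread(() -> {
>                 if (!shutdownCalled) {
>                     System.out.println("JVM exiting but ExecutorPlugin.shutdown() was never called.");
>                 }
>             }));
>         }
>
>         @Override
>         public void shutdown() {
>             shutdownCalled = true;
>             System.out.println("Executor plugin closed successfully.");
>         }
>     }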





[jira] [Resolved] (SPARK-29152) Spark Executor Plugin API shutdown is not proper when dynamic allocation enabled

2019-11-30 Thread Sean R. Owen (Jira)


[ https://issues.apache.org/jira/browse/SPARK-29152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-29152.
--
Resolution: Won't Fix

> Spark Executor Plugin API shutdown is not proper when dynamic allocation 
> enabled
> 
>
> Key: SPARK-29152
> URL: https://issues.apache.org/jira/browse/SPARK-29152
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4
>Reporter: jobit mathew
>Priority: Major
>


