This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 992b69dbc327 [SPARK-47774][INFRA] Remove redundant rules from `MimaExcludes`
992b69dbc327 is described below

commit 992b69dbc3279824d7cc3b330a70a1bd5a7ab2b9
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Tue Apr 9 00:15:43 2024 -0700

    [SPARK-47774][INFRA] Remove redundant rules from `MimaExcludes`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to remove redundant rules from `MimaExcludes` for Apache Spark 4.0.0.
    
    Previously, these rules were required due to a `dev/mima` limitation, which was fixed in
    - https://github.com/apache/spark/pull/45938
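
    For context, each entry in `project/MimaExcludes.scala` is a MiMa `ProblemFilter` that suppresses one specific binary-compatibility report. The sketch below only illustrates the shape of such an entry, reusing one of the rules deleted here; the `ExampleExcludes` container name is hypothetical, and it assumes the MiMa plugin's `com.typesafe.tools.mima.core` classes are on the sbt build classpath, as they are for `MimaExcludes.scala`.

    ```scala
    // Illustrative sketch only; `ExampleExcludes` is a hypothetical container.
    // In Spark, these Seqs live inside project/MimaExcludes.scala.
    import com.typesafe.tools.mima.core._

    object ExampleExcludes {
      // Each filter cites the JIRA that required it and names the class or
      // member whose binary-compatibility report MiMa should suppress.
      lazy val rules = Seq(
        // [SPARK-45136][CONNECT] Enhance ClosureCleaner with Ammonite support
        ProblemFilters.exclude[MissingClassProblem](
          "org.apache.spark.util.MethodIdentifier$")
      )
    }
    ```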
    
    ### Why are the changes needed?
    
    To minimize the exclusion rules for Apache Spark 4.0.0 by removing the following `private class` rules.
    
    - `BasePythonRunner`
    
    https://github.com/apache/spark/blob/319edfdc5cd6731d1d630a8beeea5b23a2326f07/core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala#L102
    
    - `CoarseGrainedClusterMessage`
    
    https://github.com/apache/spark/blob/319edfdc5cd6731d1d630a8beeea5b23a2326f07/core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedClusterMessage.scala#L30
    
    - `MethodIdentifier`
    
    https://github.com/apache/spark/blob/319edfdc5cd6731d1d630a8beeea5b23a2326f07/common/utils/src/main/scala/org/apache/spark/util/ClosureCleaner.scala#L1002
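
    All three types above are Spark-internal rather than public API, which is why the exclusion rules for them are redundant once `dev/mima` handles non-public classes correctly. Below is a minimal sketch of the visibility pattern, using hypothetical names rather than the actual declarations linked above.

    ```scala
    // Hedged sketch with hypothetical names (InternalMessage, InternalMethodId).
    // A `private[spark]` type is visible only inside the org.apache.spark
    // package, so it is not part of the public binary-compatible surface that
    // MiMa needs to guard.
    package org.apache.spark.example

    private[spark] sealed trait InternalMessage
    private[spark] case class InternalMethodId(name: String) extends InternalMessage
    ```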
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs.
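
    For local verification, the same binary-compatibility check can be run through the `dev/mima` script referenced above; a sketch of typical usage from the repository root (assuming the relevant modules have already been built):

    ```bash
    # Run Spark's MiMa binary-compatibility checks locally
    # (the same checks exercised by the CIs).
    ./dev/mima
    ```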
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No
    
    Closes #45944 from dongjoon-hyun/SPARK-47774.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 project/MimaExcludes.scala | 6 ------
 1 file changed, 6 deletions(-)

diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 630dd1d77cc7..4016c5f8b3e5 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -38,17 +38,11 @@ object MimaExcludes {
     // [SPARK-44863][UI] Add a button to download thread dump as a txt in Spark UI
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.ThreadStackTrace.*"),
     ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.status.api.v1.ThreadStackTrace$"),
-    // [SPARK-44705][PYTHON] Make PythonRunner single-threaded
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.api.python.BasePythonRunner#ReaderIterator.this"),
-    // [SPARK-44198][CORE] Support propagation of the log level to the executors
-    ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$SparkAppConfig$"),
     //[SPARK-46399][Core] Add exit status to the Application End event for the use of Spark Listener
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerApplicationEnd.*"),
     ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.scheduler.SparkListenerApplicationEnd$"),
     // [SPARK-45427][CORE] Add RPC SSL settings to SSLOptions and SparkTransportConf
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.network.netty.SparkTransportConf.fromSparkConf"),
-    // [SPARK-45136][CONNECT] Enhance ClosureCleaner with Ammonite support
-    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.util.MethodIdentifier$"),
     // [SPARK-45022][SQL] Provide context for dataset API errors
     ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.QueryContext.contextType"),
     ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.QueryContext.code"),


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
