This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new e2f34c75a6ea [SPARK-48034][TESTS] NullPointerException in MapStatusesSerDeserBenchmark
e2f34c75a6ea is described below

commit e2f34c75a6ea686eb6fa4260584bc32b558ce01f
Author: Kent Yao <y...@apache.org>
AuthorDate: Mon Apr 29 11:40:39 2024 +0800

    [SPARK-48034][TESTS] NullPointerException in MapStatusesSerDeserBenchmark
    
    ### What changes were proposed in this pull request?
    
    This PR fixes an NPE in MapStatusesSerDeserBenchmark. The cause is that the tracker is stopped twice: once explicitly in `afterAll()` and again through `SparkContext.stop()`, and the second call fails because `trackerEndpoint` is already null.
    
    ```
    java.lang.NullPointerException: Cannot invoke "org.apache.spark.rpc.RpcEndpointRef.askSync(Object, scala.reflect.ClassTag)" because the return value of "org.apache.spark.MapOutputTracker.trackerEndpoint()" is null
        at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:541)
        at org.apache.spark.MapOutputTracker.sendTracker(MapOutputTracker.scala:551)
        at org.apache.spark.MapOutputTrackerMaster.stop(MapOutputTracker.scala:1242)
        at org.apache.spark.SparkEnv.stop(SparkEnv.scala:112)
        at org.apache.spark.SparkContext.$anonfun$stop$25(SparkContext.scala:2354)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1294)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:2354)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:2259)
        at org.apache.spark.MapStatusesSerDeserBenchmark$.afterAll(MapStatusesSerDeserBenchmark.scala:128)
        at org.apache.spark.benchmark.BenchmarkBase.main(BenchmarkBase.scala:80)
        at org.apache.spark.MapStatusesSerDeserBenchmark.main(MapStatusesSerDeserBenchmark.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:568)
        at org.apache.spark.benchmark.Benchmarks$.$anonfun$main$7(Benchmarks.scala:128)
        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
        at org.apache.spark.benchmark.Benchmarks$.main(Benchmarks.scala:91)
        at org.apache.spark.benchmark.Benchmarks.main(Benchmarks.scala)
    ```
    
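    The stack trace above shows a generic double-stop hazard: the first `stop()` tears down the RPC endpoint, so a second `stop()` dereferences a null reference. The PR fixes it by dropping the redundant explicit call; an alternative defense is making `stop()` idempotent with a null guard. A minimal, Spark-free sketch of that guard (the `Tracker` class and `endpoint` field here are illustrative, not Spark's actual API):
    
    ```scala
    // Minimal model of the double-stop hazard: `endpoint` is nulled out on
    // the first stop(), so an unguarded second stop() would throw an NPE.
    class Tracker {
      private var endpoint: AnyRef = new Object
    
      // Returns true if this call actually tore the endpoint down,
      // false if the tracker was already stopped.
      def stop(): Boolean = {
        if (endpoint == null) {
          false // guard: already stopped, nothing to do
        } else {
          endpoint = null // tear down; any later use of `endpoint` is unsafe
          true
        }
      }
    }
    
    object DoubleStopDemo extends App {
      val tracker = new Tracker
      val first = tracker.stop()  // tears down the endpoint
      val second = tracker.stop() // safe only because of the guard
      println(s"first=$first second=$second")
    }
    ```
    
    Spark's `MapOutputTrackerMaster.stop()` has no such guard, which is why removing the extra call in the benchmark is the right fix here.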
    ### Why are the changes needed?
    
    Test-only bug fix: without it, MapStatusesSerDeserBenchmark fails with the NullPointerException above during cleanup.
    
    ### Does this PR introduce _any_ user-facing change?
    
    no
    
    ### How was this patch tested?
    
    manually
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    no
    
    Closes #46270 from yaooqinn/SPARK-48034.
    
    Authored-by: Kent Yao <y...@apache.org>
    Signed-off-by: Kent Yao <y...@apache.org>
    (cherry picked from commit 59d5946cfd377e9203ccf572deb34f87fab7510c)
    Signed-off-by: Kent Yao <y...@apache.org>
---
 core/src/test/scala/org/apache/spark/MapStatusesSerDeserBenchmark.scala | 1 -
 1 file changed, 1 deletion(-)

diff --git a/core/src/test/scala/org/apache/spark/MapStatusesSerDeserBenchmark.scala b/core/src/test/scala/org/apache/spark/MapStatusesSerDeserBenchmark.scala
index 797b650799ea..795da65079d6 100644
--- a/core/src/test/scala/org/apache/spark/MapStatusesSerDeserBenchmark.scala
+++ b/core/src/test/scala/org/apache/spark/MapStatusesSerDeserBenchmark.scala
@@ -123,7 +123,6 @@ object MapStatusesSerDeserBenchmark extends BenchmarkBase {
   }
 
   override def afterAll(): Unit = {
-    tracker.stop()
     if (sc != null) {
       sc.stop()
     }


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
