This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6767053dacd9 [SPARK-48218][CORE] TransportClientFactory.createClient may NPE cause FetchFailedException
6767053dacd9 is described below

commit 6767053dacd9df623336e1f5faabf1eb16b7a7dd
Author: sychen <syc...@ctrip.com>
AuthorDate: Wed May 15 09:33:39 2024 -0700

    [SPARK-48218][CORE] TransportClientFactory.createClient may NPE cause FetchFailedException
    
    ### What changes were proposed in this pull request?
    This PR adds a null check for the `TransportChannelHandler` retrieved from the channel pipeline in the `TransportClientFactory.createClient` method.
    
    ### Why are the changes needed?
    
    At line 178 of `org.apache.spark.network.client.TransportClientFactory#createClient(java.lang.String, int, boolean)`, the `synchronized (handler)` block throws a `NullPointerException` when `handler` is null:
    ```java
          TransportChannelHandler handler = cachedClient.getChannel().pipeline()
            .get(TransportChannelHandler.class);
          synchronized (handler) {
            handler.getResponseHandler().updateTimeOfLastRequest();
          }
    ```
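
    For illustration only (not part of the patch): a minimal, self-contained sketch showing that Netty's `ChannelPipeline.get(Class)` returns null once the handler is no longer in the pipeline, which is presumably how `handler` ends up null here while the cached connection is being torn down. `DummyHandler` and `NullHandlerDemo` are names invented for this sketch, not Spark classes.

    ```java
    import io.netty.channel.ChannelInboundHandlerAdapter;
    import io.netty.channel.embedded.EmbeddedChannel;

    public class NullHandlerDemo {
      // Stand-in handler for this demo only.
      static class DummyHandler extends ChannelInboundHandlerAdapter {}

      public static void main(String[] args) {
        EmbeddedChannel channel = new EmbeddedChannel(new DummyHandler());

        // While the handler is installed, pipeline().get(...) finds it.
        System.out.println(channel.pipeline().get(DummyHandler.class) != null); // true

        // Once the handler has been removed (in the real race, presumably because the
        // connection is being closed), pipeline().get(...) returns null, and
        // `synchronized (handler)` on that result throws a NullPointerException.
        channel.pipeline().remove(DummyHandler.class);
        System.out.println(channel.pipeline().get(DummyHandler.class)); // null
      }
    }
    ```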
    
    ```java
    org.apache.spark.shuffle.FetchFailedException
            at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
            at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
            at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
            at org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)

    Caused by: java.lang.NullPointerException
            at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
            at org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
            at org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
            at org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
            at org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
    ```
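
    As a further illustration (again not part of the patch), a hypothetical extraction of the guarded pattern that the diff at the bottom of this message applies. `GuardedUpdateSketch`, `touchIfPresent`, and `DummyHandler` are names invented for this sketch; the real code updates the timestamp via `handler.getResponseHandler().updateTimeOfLastRequest()`.

    ```java
    import io.netty.channel.ChannelInboundHandlerAdapter;
    import io.netty.channel.embedded.EmbeddedChannel;

    public class GuardedUpdateSketch {
      // Hypothetical stand-in for TransportChannelHandler.
      static class DummyHandler extends ChannelInboundHandlerAdapter {
        void updateTimeOfLastRequest() { /* the real handler records the last request time */ }
      }

      // Hypothetical helper mirroring the guarded block in the diff below:
      // skip the update when the handler is missing instead of NPE-ing on a null monitor.
      static void touchIfPresent(EmbeddedChannel channel) {
        DummyHandler handler = channel.pipeline().get(DummyHandler.class);
        if (handler != null) {
          synchronized (handler) {
            handler.updateTimeOfLastRequest();
          }
        }
      }

      public static void main(String[] args) {
        touchIfPresent(new EmbeddedChannel(new DummyHandler())); // handler present: timestamp updated
        touchIfPresent(new EmbeddedChannel());                   // handler absent: no-op, no NPE
        System.out.println("no NullPointerException with a missing handler");
      }
    }
    ```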
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #46506 from cxzl25/SPARK-48218.
    
    Authored-by: sychen <syc...@ctrip.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .../org/apache/spark/network/client/TransportClientFactory.java     | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java b/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java
index ddf1b3cce349..f2dbfd92b854 100644
--- a/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java
+++ b/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java
@@ -171,8 +171,10 @@ public class TransportClientFactory implements Closeable {
       // this code was able to update things.
       TransportChannelHandler handler = cachedClient.getChannel().pipeline()
         .get(TransportChannelHandler.class);
-      synchronized (handler) {
-        handler.getResponseHandler().updateTimeOfLastRequest();
+      if (handler != null) {
+        synchronized (handler) {
+          handler.getResponseHandler().updateTimeOfLastRequest();
+        }
       }
 
       if (cachedClient.isActive()) {


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
