This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 66f5f964f82 [SPARK-45081][SQL][3.4] Encoders.bean no longer works with read-only properties
66f5f964f82 is described below

commit 66f5f964f8213e263e8aefb38a7e733753836995
Author: Giambattista Bloisi <gblo...@gmail.com>
AuthorDate: Mon Sep 18 13:39:54 2023 -0700

    [SPARK-45081][SQL][3.4] Encoders.bean no longer works with read-only properties
    
    ### What changes were proposed in this pull request?
    This PR re-enables Encoders.bean to be called on beans that have read-only properties, that is, properties with a getter but no setter method. Beans with read-only properties are even used in internal tests.
    In the Java bean encoder, setter methods are stored within an Option wrapper because they are missing for read-only properties. When a Java bean has to be initialized, the setter methods for its properties are invoked: this PR filters read-only properties out of that process.
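The property shape the patch handles can be demonstrated without Spark. This is not part of the patch, just a minimal standalone sketch (class names `ReadOnlyDemo` and `ReadOnlyPropertyBean` are illustrative, the latter mirroring the regression test below) using the JDK's own `java.beans.Introspector`: a getter-only property yields a `PropertyDescriptor` whose write method is null, which is exactly the condition the fix filters on (`_.writeMethod.isDefined` on the Scala side).

```java
import java.beans.BeanInfo;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.util.Arrays;

public class ReadOnlyDemo {
    // Illustrative bean mirroring the regression test: getter only, no setter.
    public static class ReadOnlyPropertyBean {
        public boolean isEmpty() {
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        // Stop at Object.class so the synthetic "class" property is excluded.
        BeanInfo info = Introspector.getBeanInfo(ReadOnlyPropertyBean.class, Object.class);
        for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
            // For a read-only property getWriteMethod() returns null, which is
            // why the encoder must skip it when generating setter calls.
            System.out.println(pd.getName() + " writable=" + (pd.getWriteMethod() != null));
        }
        // The fix is equivalent to keeping only properties with a write method:
        long writable = Arrays.stream(info.getPropertyDescriptors())
            .filter(pd -> pd.getWriteMethod() != null)
            .count();
        System.out.println("writable properties: " + writable); // prints 0 here
    }
}
```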
    
    ### Why are the changes needed?
    The changes are required to avoid the exception thrown by getting the value of a None Option object.
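The failure mode is the generic "get on an empty container" error, not anything Spark-specific. As an analogy only (the patched code uses Scala's `Option`, not `java.util.Optional`), calling `get()` on an absent value fails the same way `None.get` does:

```java
import java.util.NoSuchElementException;
import java.util.Optional;

public class NoneGetDemo {
    public static void main(String[] args) {
        // A read-only property has no setter, so its write method is absent.
        Optional<String> writeMethod = Optional.empty();
        try {
            writeMethod.get(); // throws: the value is not present
        } catch (NoSuchElementException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```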
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    An additional regression test has been added
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    hvanhovell this is the branch-3.4 port of [PR-42829](https://github.com/apache/spark/pull/42829)
    
    Closes #42913 from gbloisi-openaire/SPARK-45081-branch-3.4.
    
    Authored-by: Giambattista Bloisi <gblo...@gmail.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .../org/apache/spark/sql/catalyst/ScalaReflection.scala  |  4 +++-
 .../java/test/org/apache/spark/sql/JavaDatasetSuite.java | 16 ++++++++++++++++
 2 files changed, 19 insertions(+), 1 deletion(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala
index 2e03f32a58d..b18613bdad3 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala
@@ -345,7 +345,9 @@ object ScalaReflection extends ScalaReflection {
         CreateExternalRow(convertedFields, enc.schema))
 
     case JavaBeanEncoder(tag, fields) =>
-      val setters = fields.map { f =>
+      val setters = fields
+        .filter(_.writeMethod.isDefined)
+        .map { f =>
         val newTypePath = walkedTypePath.recordField(
           f.enc.clsTag.runtimeClass.getName,
           f.name)
diff --git a/sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java b/sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java
index 228b7855142..6a9ffef6991 100644
--- a/sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java
+++ b/sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java
@@ -1711,6 +1711,22 @@ public class JavaDatasetSuite implements Serializable {
     Assert.assertEquals(1, df.collectAsList().size());
   }
 
+  public static class ReadOnlyPropertyBean implements Serializable {
+    public boolean isEmpty() {
+      return true;
+    }
+  }
+
+  @Test
+  public void testReadOnlyPropertyBean() {
+    ReadOnlyPropertyBean bean = new ReadOnlyPropertyBean();
+    List<ReadOnlyPropertyBean> data = Arrays.asList(bean);
+    Dataset<ReadOnlyPropertyBean> df = spark.createDataset(data,
+            Encoders.bean(ReadOnlyPropertyBean.class));
+    Assert.assertEquals(1, df.schema().length());
+    Assert.assertEquals(1, df.collectAsList().size());
+  }
+
   public class CircularReference1Bean implements Serializable {
     private CircularReference2Bean child;
 

