[ https://issues.apache.org/jira/browse/SPARK-45311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17785275#comment-17785275 ]

Marc Le Bihan edited comment on SPARK-45311 at 11/12/23 10:56 AM:
------------------------------------------------------------------

The problem *java.util.NoSuchElementException: None.get* happens in the 
`JavaBeanEncoder(tag, fields)` case of the `deserializerFor` method in 
`ScalaReflection`.

(sorry for the indentation, which I had to adjust after a bad copy-paste)

 
{code:java}
case JavaBeanEncoder(tag, fields) =>
  val setters = fields.map { f =>
    val newTypePath = walkedTypePath.recordField(
      f.enc.clsTag.runtimeClass.getName,
      f.name)
    val setter = expressionWithNullSafety(
      deserializerFor(
        f.enc,
        addToPath(path, f.name, f.enc.dataType, newTypePath),
        newTypePath),
      nullable = f.nullable,
      newTypePath)
    f.writeMethod.get -> setter
  }
{code}
 
`f.writeMethod` is `None` whenever the observed accessor, a `getSomething()` or 
`isSomething()` method, has no corresponding setter method `setSomething(...)`; 
calling `.get` on it then throws the exception.
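
As an illustration, a minimal Java bean of this shape reproduces the failure 
(the `City` class and its members here are hypothetical, not taken from the 
project):

{code:java}
import java.io.Serializable;

// getName()/setName(String) form a complete bean property, but
// isValid() has no setValid(boolean) counterpart: its writeMethod
// is None, so f.writeMethod.get throws NoSuchElementException.
public class City implements Serializable {
    private String name;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    // Read-only check method, accepted up to Spark 3.3.x.
    public boolean isValid() { return name != null && !name.isBlank(); }
}
{code}

Calling `Encoders.bean(City.class)` on such a class then fails with the 
`None.get` error on 3.4.x.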

I cannot tell whether this is the intended new behavior of Spark since `3.4.x` 
or a regression.

But up to `3.3.x`, a class could have `isSomething()` methods, used to do some 
checks, without needing a matching setter.

—
 
The workaround is to rename a method that has no setter to `something()` alone, 
without the `get` or `is` prefix.
Spark then skips it.
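
For instance, with the hypothetical `City` bean sketched above:

{code:java}
import java.io.Serializable;

public class City implements Serializable {
    private String name;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    // Renamed from isValid(): without the "is" prefix, bean introspection
    // no longer reports it as a property, so Spark ignores it.
    public boolean valid() { return name != null && !name.isBlank(); }
}
{code}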



> Encoder fails on many "NoSuchElementException: None.get" since 3.4.x, search 
> for an encoder for a generic type, and since 3.5.x isn't "an expression 
> encoder"
> -------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-45311
>                 URL: https://issues.apache.org/jira/browse/SPARK-45311
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.0, 3.4.1, 3.5.0
>         Environment: Debian 12
> Java 17
> Underlying Spring-Boot 2.7.14
>            Reporter: Marc Le Bihan
>            Priority: Major
>
> If you find it convenient, you might clone the 
> [https://gitlab.com/territoirevif/minimal-tests-spark-issue] project (which 
> does many operations around cities, local authorities and accounting with 
> open data), where I've extracted from my work what's necessary to build a set 
> of 35 tests that run correctly with Spark 3.3.x and show the troubles 
> encountered with 3.4.x and 3.5.x.
>  
> It works well with Spark 3.2.x and 3.3.x. But as soon as I select *Spark 
> 3.4.x*, where the encoder seems to have deeply changed, the encoder fails 
> with two problems:
>  
> *1)* It throws *java.util.NoSuchElementException: None.get* messages 
> everywhere.
> Asking around the Internet, I saw I wasn't alone facing this problem. Reading 
> the question below, you'll see that I've attempted a debug, but my Scala 
> skills are low.
> [https://stackoverflow.com/questions/76036349/encoders-bean-doesnt-work-anymore-on-a-java-pojo-with-spark-3-4-0]
> By the way, if possible, the encoder and decoder functions should forward the 
> name of the field being handled as soon as it is known, and carry it all 
> along their process, so that whenever the encoder reaches a point where it 
> has to throw an exception, it knows which field it is handling and can emit a 
> message like:
> _java.util.NoSuchElementException: None.get when encoding [the method or 
> field it was targeting]_
>  
> *2)* *Not found an encoder of the type RS to Spark SQL internal 
> representation.* Consider to change the input type to one of supported at 
> (...)
> Or: Not found an encoder of the type *OMI_ID* to Spark SQL internal 
> representation (...)
>  
> where *RS* and *OMI_ID* are generic types.
> This is strange.
> [https://stackoverflow.com/questions/76045255/encoders-bean-attempts-to-check-the-validity-of-a-return-type-considering-its-ge]
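> A hypothetical sketch of the generic pattern involved (the names are 
> illustrative, not the real project code):
> {code:java}
> import java.io.Serializable;
> 
> // Base bean with a type variable: getId() returns OMI_ID.
> public abstract class Identified<OMI_ID extends Serializable>
>         implements Serializable {
>     private OMI_ID id;
> 
>     public OMI_ID getId() { return id; }
>     public void setId(OMI_ID id) { this.id = id; }
> }
> 
> // In a separate file: a concrete subclass binding the type variable.
> public class CityRecord extends Identified<Long> {
> }
> {code}
> `Encoders.bean(CityRecord.class)` on such a shape worked up to 3.3.x; on 
> 3.4.x the introspection reports the type variable itself as unsupported.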
>  
> *3)* When I switch to the *Spark 3.5.0* version, the same problems remain, 
> but another one adds itself to the list:
> "{*}Only expression encoders are supported for now{*}" on what was accepted 
> and working before.
>  


