Hello,
I am writing to check whether what I am encountering is a bug or the expected behavior from Spark 3.4.x onwards.
I've noticed that, since 3.4.x, analysis quickly fails with a "/NoSuchElementException: None.get/" in the JavaBeanEncoder during deserialization, if a candidate field has an accessor method, /getSomething()/ or /isSomething()/, but no associated setter.
The "/NoSuchElementException: None.get/" comes from the statement *
f.writeMethod.get -> setter*
that finds no setter for the getter, and fails on /None/.
case JavaBeanEncoder(tag, fields) =>
  val setters = fields.map { f =>
    val newTypePath = walkedTypePath.recordField(
      f.enc.clsTag.runtimeClass.getName,
      f.name)
    val setter = expressionWithNullSafety(
      deserializerFor(
        f.enc,
        addToPath(path, f.name, f.enc.dataType, newTypePath),
        newTypePath),
      nullable = f.nullable,
      newTypePath)
    f.writeMethod.get -> setter   // <- fails with None.get when the field has no setter
  }
Spark versions 3.3.x and below allowed an accessor to have no setter linked to it.
I've found no indication in the migration guide that a rule now enforces writing a setter for each existing accessor.
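For illustration, here is a minimal sketch of the kind of bean that triggers the problem (the class and member names are hypothetical, not taken from a real project):

    public class Measurement {
        private double value;

        public double getValue() { return value; }
        public void setValue(double value) { this.value = value; }

        // Accessor with no associated setter: since 3.4.x, building the
        // deserializer for this bean fails with "NoSuchElementException: None.get".
        public double getDoubled() { return value * 2; }
    }

If my understanding is correct, a bean shaped like this is accepted by /Encoders.bean(Measurement.class)/ on 3.3.x but makes the analysis fail on 3.4.x and later.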
A workaround is to rename these accessors to the "/record-style name/" now favored by Java Records, where the *getSomething()* or *isSomething()* accessors become *something()*. Then, Spark doesn't detect these accessors and doesn't stumble over them.
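Applied to the hypothetical bean above, the workaround would look like this:

    public class Measurement {
        private double value;

        public double getValue() { return value; }
        public void setValue(double value) { this.value = value; }

        // Renamed from getDoubled() to the record-style doubled():
        // bean introspection no longer treats it as a property, so no setter is required.
        public double doubled() { return value * 2; }
    }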
If it's the expected new behavior, would it be possible to handle the detection of a missing setter more gracefully?
A "/NoSuchElementException: None.get/" message stopping the analysis gives no clue: what failed? where?
An error message like "/no setter associated with the accessor {} for field {} in class {}/" would be useful for the developer, and could perhaps mention the workaround I suggest above.
Regards,
Marc Le Bihan
Related issue: SPARK-45311, "Encoder fails on many "NoSuchElementException: None.get" since 3.4.x, search for an encoder for a generic type, and since 3.5.x isn't "an expression encoder""
https://issues.apache.org/jira/browse/SPARK-45311