I don't know this code well, but yes, it seems like something is looking for members of a companion object when there is none here. Can you show any more of the stack trace or of the generated code?
On Thu, Apr 29, 2021 at 7:40 AM Rico Bergmann <i...@ricobergmann.de> wrote:

> Hi all!
>
> A simplified code snippet of what my Spark pipeline, written in Java, does:
>
> public class MyPojo implements Serializable {
>
>     ... // some fields with getters and setters
>
> }
>
> a custom Aggregator (defined in the Driver class):
>
> public static class MyAggregator extends
>         org.apache.spark.sql.expressions.Aggregator<Row, MyPojo, MyPojo> { ... }
>
> In my Driver I do:
>
> Dataset<Row> inputDF = ... // some calculations before
>
> inputDF.groupBy("col1", "col2", "col3")
>        .agg(new MyAggregator().toColumn().name("aggregated"));
>
> When executing this part I get a CompileException complaining about an
> unknown variable or type "MyPojo$.MODULE$". To me it looks like the
> CodeGenerator generates code for Scala (since, as far as I know, .MODULE$
> is a Scala-specific field). I tried it with Spark 3.1.1 and Spark 3.0.1.
>
> Does anyone have an idea what's going wrong here?
>
> Best,
>
> Rico.
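
For reference, a minimal sketch of how such a Java Aggregator over a bean-encoded POJO is typically written. The field names, the "value" input column, and the Driver wrapper class below are illustrative assumptions, not taken from the original mail; the point is only that the buffer and output encoders are supplied explicitly via Encoders.bean, so the generated code works against the Java bean rather than expecting a Scala companion object (the MODULE$ reference):

import java.io.Serializable;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.expressions.Aggregator;

public class Driver {

    // Simplified stand-in for the poster's POJO; the real fields are unknown.
    public static class MyPojo implements Serializable {
        private long count;
        private double total;

        public long getCount() { return count; }
        public void setCount(long count) { this.count = count; }
        public double getTotal() { return total; }
        public void setTotal(double total) { this.total = total; }
    }

    public static class MyAggregator extends Aggregator<Row, MyPojo, MyPojo> {
        @Override
        public MyPojo zero() {
            // Empty buffer for a new group
            return new MyPojo();
        }

        @Override
        public MyPojo reduce(MyPojo buffer, Row row) {
            // "value" is an assumed input column name; adapt to the real schema
            buffer.setCount(buffer.getCount() + 1);
            buffer.setTotal(buffer.getTotal() + row.getDouble(row.fieldIndex("value")));
            return buffer;
        }

        @Override
        public MyPojo merge(MyPojo b1, MyPojo b2) {
            b1.setCount(b1.getCount() + b2.getCount());
            b1.setTotal(b1.getTotal() + b2.getTotal());
            return b1;
        }

        @Override
        public MyPojo finish(MyPojo reduction) {
            return reduction;
        }

        @Override
        public Encoder<MyPojo> bufferEncoder() {
            // Explicit bean encoder for the Java POJO
            return Encoders.bean(MyPojo.class);
        }

        @Override
        public Encoder<MyPojo> outputEncoder() {
            return Encoders.bean(MyPojo.class);
        }
    }
}

Whether this matches the poster's setup can only be confirmed from the full stack trace and the generated code requested above.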