Re: Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-30 Thread Rico Bergmann
Indeed, adding public constructors solved the problem... Thanks a lot!

> On 29.04.2021 at 18:53, Rico Bergmann wrote:
>
> It didn't have it. So I added public no-args and all-args constructors. But I
> still get the same error.
>
>> On 29.04.2021 at 17:47, Sean Owen wrote:
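For readers landing on this thread: below is a minimal sketch of the fix discussed here, namely a POJO with public no-args and all-args constructors so that Spark's bean encoder (and the code it generates) can instantiate it. The field names, the sample data, and the local SparkSession setup are illustrative assumptions, not taken from the original pipeline.

import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;

// Illustrative POJO: public no-args and all-args constructors plus
// getters/setters, so the bean encoder can construct instances in
// generated code.
public class MyPojo implements Serializable {
    private String name;
    private long value;

    public MyPojo() { }                          // public no-args constructor

    public MyPojo(String name, long value) {     // public all-args constructor
        this.name = name;
        this.value = value;
    }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public long getValue() { return value; }
    public void setValue(long value) { this.value = value; }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("pojo-encoder-example")
                .master("local[*]")
                .getOrCreate();

        // Register the POJO with the bean encoder; the generated
        // deserializer relies on the public constructor above.
        Dataset<MyPojo> ds = spark.createDataset(
                Arrays.asList(new MyPojo("a", 1L), new MyPojo("b", 2L)),
                Encoders.bean(MyPojo.class));

        ds.show();
        spark.stop();
    }
}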

Re: Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-29 Thread Rico Bergmann
It didn't have it. So I added public no-args and all-args constructors. But I still get the same error.

> On 29.04.2021 at 17:47, Sean Owen wrote:
>
> From tracing the code a bit, it might do this if the POJO class has no public
> constructors - does it?
>
>> On Thu, Apr 29, 2021

Re: Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-29 Thread Sean Owen
From tracing the code a bit, it might do this if the POJO class has no public constructors - does it?

On Thu, Apr 29, 2021 at 9:55 AM Rico Bergmann wrote:
> Here is the relevant generated code and the Exception stacktrace.
>
> The problem in the generated code is at line 35.
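A quick way to verify this hypothesis (a hypothetical check, not part of the original thread) is to ask reflection whether the POJO exposes any public constructors; it assumes the MyPojo class from the original post is on the classpath.

import java.lang.reflect.Constructor;

public class ConstructorCheck {
    public static void main(String[] args) {
        // getConstructors() returns only the *public* constructors;
        // an empty array would explain why generated code cannot
        // instantiate the class.
        Constructor<?>[] ctors = MyPojo.class.getConstructors();
        System.out.println("public constructors of MyPojo: " + ctors.length);
        for (Constructor<?> c : ctors) {
            System.out.println("  " + c);
        }
    }
}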

Re: Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-29 Thread Rico Bergmann
Here is the relevant generated code and the Exception stacktrace. The problem in the generated code is at line 35.

/* 001 */ public java.lang.Object generate(Object[] references) {
/* 002 */   return new SpecificSafeProjection(references);
/* 003 */ }
/* 004 */
/* 005 */ class

Re: Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-29 Thread Sean Owen
I don't know this code well, but yes, it seems like something is looking for members of a companion object when there is none here. Can you show any more of the stack trace or generated code?

On Thu, Apr 29, 2021 at 7:40 AM Rico Bergmann wrote:
> Hi all!
>
> A simplified code snippet of what my

Spark DataFrame CodeGeneration in Java generates Scala specific code?

2021-04-29 Thread Rico Bergmann
Hi all!

A simplified code snippet of what my Spark pipeline written in Java does:

public class MyPojo implements Serializable {
  ... // some fields with getters and setters
}

a custom Aggregator (defined in the Driver class):

public static MyAggregator extends
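The post is truncated at this point in the archive. For orientation, here is a minimal sketch of what such a static nested Java Aggregator over the POJO typically looks like; the buffer and output types (a simple Long count) and the Driver class body are assumptions, not the original code.

import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.expressions.Aggregator;

public class Driver {

    // Hypothetical sketch of the nested Aggregator the post describes;
    // the real MyAggregator is not shown beyond its declaration.
    public static class MyAggregator extends Aggregator<MyPojo, Long, Long> {
        @Override public Long zero() { return 0L; }
        @Override public Long reduce(Long buffer, MyPojo input) { return buffer + 1L; }
        @Override public Long merge(Long b1, Long b2) { return b1 + b2; }
        @Override public Long finish(Long reduction) { return reduction; }
        @Override public Encoder<Long> bufferEncoder() { return Encoders.LONG(); }
        @Override public Encoder<Long> outputEncoder() { return Encoders.LONG(); }
    }
}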