Hi,
I'm trying to convert a Scala Spark job into Java.
In Scala, I typically use a 'case class' to apply a schema to an RDD.
That can be translated into a POJO class in Java, but what I really want is to
create POJO classes dynamically at runtime, the way the Scala REPL does.
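For reference, the static route would look roughly like the sketch below. I'm
assuming Spark 1.0's Java API (JavaSQLContext.applySchema) here, and 'Person'
with its fields is just a placeholder for whatever the case class carries:

    import java.io.Serializable;
    import java.util.Arrays;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.api.java.JavaSQLContext;
    import org.apache.spark.sql.api.java.JavaSchemaRDD;

    public class StaticPojoExample {

        // Static Java equivalent of `case class Person(name: String, age: Int)`:
        // a Serializable bean with getters/setters so Spark SQL can infer the schema.
        public static class Person implements Serializable {
            private String name;
            private int age;

            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
            public int getAge() { return age; }
            public void setAge(int age) { this.age = age; }
        }

        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext("local", "static-pojo-example");
            JavaSQLContext sqlCtx = new JavaSQLContext(sc);

            Person p = new Person();
            p.setName("alice");
            p.setAge(30);
            JavaRDD<Person> people = sc.parallelize(Arrays.asList(p));

            // Apply the schema inferred from the bean class and register it as a table.
            JavaSchemaRDD schemaPeople = sqlCtx.applySchema(people, Person.class);
            schemaPeople.registerAsTable("people");
        }
    }

This works, but it forces me to know the fields at compile time, which is
exactly what I want to avoid.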
For this reason, I use Javassist to generate the POJO class at runtime.
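Concretely, the generation step looks roughly like the sketch below (the
generated class name matches the one in the error; the 'name' field and the
accessor names are just examples):

    import javassist.ClassPool;
    import javassist.CtClass;
    import javassist.CtField;
    import javassist.CtNewMethod;

    public class PojoGenerator {

        // Build a bean-style class ("GeneratedClass_no1") with a single String
        // field and load it through the current thread's context classloader.
        public static Class<?> generate() throws Exception {
            ClassPool pool = ClassPool.getDefault();
            CtClass ctClass = pool.makeClass("GeneratedClass_no1");
            ctClass.addInterface(pool.get("java.io.Serializable"));

            CtField field = CtField.make("private String name;", ctClass);
            ctClass.addField(field);
            ctClass.addMethod(CtNewMethod.getter("getName", field));
            ctClass.addMethod(CtNewMethod.setter("setName", field));

            // This loads the bytecode into the driver JVM only.
            return ctClass.toClass(
                    Thread.currentThread().getContextClassLoader(), null);
        }
    }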
The problem is that the worker nodes can't find this class.
The error message is:
host workernode2.com: java.lang.ClassNotFoundException: GeneratedClass_no1
        java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        java.security.AccessController.doPrivileged(Native Method)
        java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        java.lang.Class.forName0(Native Method)
        java.lang.Class.forName(Class.java:266)
The generated class is loaded through
'Thread.currentThread().getContextClassLoader()'.
I expected that to be enough: the class is visible on the driver node, but the
worker nodes' executors cannot see it.
Would changing the classloader used to load the generated class, or
broadcasting the generated class via the SparkContext, be effective?
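To make the question concrete, the second idea I have in mind is roughly the
sketch below: dump the generated bytecode into a jar on the driver and register
it with addJar() so that the executor classloaders can pick it up. (This uses
addJar() rather than an actual broadcast; the jar path and helper name are just
placeholders, and I'm not sure whether this, or broadcasting the raw bytes and
loading them with a custom classloader on the executors, is the right
mechanism.)

    import java.io.FileOutputStream;
    import java.util.jar.JarEntry;
    import java.util.jar.JarOutputStream;

    import javassist.CtClass;

    import org.apache.spark.api.java.JavaSparkContext;

    public class ShipGeneratedClass {

        // Write the generated class into a jar and register that jar with the
        // SparkContext so that executor classloaders can find it.
        // Note: write the bytecode out before toClass() freezes the CtClass.
        public static void shipToExecutors(JavaSparkContext sc, CtClass ctClass)
                throws Exception {
            String jarPath = "/tmp/generated-classes.jar";  // placeholder path

            try (JarOutputStream jar =
                    new JarOutputStream(new FileOutputStream(jarPath))) {
                String entryName = ctClass.getName().replace('.', '/') + ".class";
                jar.putNextEntry(new JarEntry(entryName));
                jar.write(ctClass.toBytecode());
                jar.closeEntry();
            }

            // Executors download jars added here and append them to their classpath.
            sc.addJar(jarPath);
        }
    }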



