[ https://issues.apache.org/jira/browse/SPARK-4877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14305810#comment-14305810 ]

holdenk commented on SPARK-4877:
--------------------------------

Hi Matt,

I don't believe we need to override loadClass. That would be true if we created 
a class loader with a parent in the normal sense, but since we extend the 
classloader while keeping its parent null and do our own resolution routing, 
loadClass shouldn't be able to trigger a resolution on the parent: it has no 
knowledge of the parent.
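
As a minimal sketch of that pattern (the names here are illustrative, not 
Spark's actual classes):

    import java.net.{URL, URLClassLoader}

    // The superclass is constructed with a null parent, so the built-in
    // loadClass delegation (beyond the bootstrap loader) can never reach the
    // real parent; the only path to it is our explicit fallback below.
    class ChildFirstClassLoader(urls: Array[URL], realParent: ClassLoader)
      extends URLClassLoader(urls, null) {

      override def findClass(name: String): Class[_] =
        try {
          super.findClass(name)  // resolve from the user's jars first
        } catch {
          case _: ClassNotFoundException =>
            realParent.loadClass(name)  // explicit, controlled fallback
        }
    }

So loadClass on this loader only ever consults the bootstrap loader and then 
findClass; the "parent" in the usual delegation-model sense simply isn't there.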

> userClassPathFirst doesn't handle user classes inheriting from parent
> ---------------------------------------------------------------------
>
>                 Key: SPARK-4877
>                 URL: https://issues.apache.org/jira/browse/SPARK-4877
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Stephen Haberman
>
> We're trying out userClassPathFirst.
> To do so, we make an uberjar that does not contain Spark or Scala classes 
> (because we want those to load from the parent classloader; otherwise we'd 
> get errors like scala.Function0 != scala.Function0, since the two copies 
> would be loaded by different class loaders).
> (Tangentially, some isolation classloaders, such as Jetty's, whitelist 
> certain packages, like spark/* and scala/*, to come only from the parent 
> classloader, so that even if the user messes up and leaks the Scala/Spark 
> jars into their uberjar, it won't blow up; I think this would be a good 
> enhancement.)
> Anyway, we have a custom Kryo registrar that ships in our uberjar, but 
> since it "extends spark.KryoRegistrator", and that class is not in our 
> uberjar, we get a ClassNotFoundException.
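
For what it's worth, a rough sketch of the whitelist enhancement suggested 
tangentially in the description above might look like this (the class name 
and prefix list are illustrative assumptions, not Spark's actual 
implementation):

    import java.net.{URL, URLClassLoader}

    // Hypothetical whitelist variant: classes under certain prefixes always
    // resolve from the real parent, even if the uberjar accidentally leaks
    // copies of the Spark/Scala jars.
    class WhitelistingClassLoader(
        urls: Array[URL],
        realParent: ClassLoader,
        parentOnly: Seq[String] = Seq("scala.", "org.apache.spark."))
      extends URLClassLoader(urls, null) {

      override def loadClass(name: String, resolve: Boolean): Class[_] =
        if (parentOnly.exists(name.startsWith)) {
          // Whitelisted classes must come from the parent so that a type
          // like scala.Function0 is only ever loaded once.
          realParent.loadClass(name)
        } else {
          super.loadClass(name, resolve)
        }

      override def findClass(name: String): Class[_] =
        try super.findClass(name)  // user jars first
        catch { case _: ClassNotFoundException => realParent.loadClass(name) }
    }

Under that scheme the KryoRegistrator superclass would always resolve from 
the parent loader, even though the registrar subclass comes from the uberjar.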


