[ https://issues.apache.org/jira/browse/SPARK-4877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14288945#comment-14288945 ]

Stephen Haberman commented on SPARK-4877:
-----------------------------------------

Hi Matt,

I don't doubt you are right, but can you clarify?

I had noticed that the Jetty/Hadoop implementations do override loadClass, but 
I was going for a minimal set of changes for this initial PR. It also passes 
the tests and works for us in production, so AFAICT it was okay.

I think more fully porting the Jetty/Hadoop code over is a good idea; I was 
just assuming that moving from findClass to loadClass wouldn't be required 
until taking that on.
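
To make sure we're talking about the same change, here is a rough sketch of 
what I understand overriding loadClass to buy us (an illustration only, not 
the actual Jetty/Hadoop or Spark code): the default ClassLoader.loadClass 
always delegates to the parent before calling findClass, so a findClass-only 
override never even sees classes the parent can already resolve, whereas 
overriding loadClass lets the child classpath win:

{code:scala}
import java.net.{URL, URLClassLoader}

// Rough sketch of a child-first ("user classpath first") loader in the
// style of the Jetty/Hadoop loaders -- an illustration, not their code.
class ChildFirstClassLoader(urls: Array[URL], parent: ClassLoader)
    extends URLClassLoader(urls, parent) {

  override def loadClass(name: String, resolve: Boolean): Class[_] =
    getClassLoadingLock(name).synchronized {
      // Reuse a class this loader has already defined, if any.
      var c: Class[_] = findLoadedClass(name)
      if (c == null) {
        c = try {
          // Try the user (child) classpath first...
          findClass(name)
        } catch {
          case _: ClassNotFoundException =>
            // ...and fall back to normal parent delegation only when the
            // child classpath doesn't have the class.
            super.loadClass(name, resolve)
        }
      }
      if (resolve) resolveClass(c)
      c
    }
}
{code}

The findLoadedClass check and the per-name lock mirror what the default 
ClassLoader.loadClass already does; the only real change is consulting 
findClass before the parent instead of after.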

> userClassPathFirst doesn't handle user classes inheriting from parent
> ---------------------------------------------------------------------
>
>                 Key: SPARK-4877
>                 URL: https://issues.apache.org/jira/browse/SPARK-4877
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Stephen Haberman
>
> We're trying out userClassPathFirst.
> To do so, we make an uberjar that does not contain Spark or Scala classes 
> (because we want those to load from the parent classloader, otherwise we'll 
> get errors like scala.Function0 != scala.Function0 since they'd load from 
> different class loaders).
> (Tangentially, some isolation classloaders like Jetty whitelist certain 
> packages, like spark/* and scala/*, to only come from the parent classloader, 
> so that technically if the user still messes up and leaks the Scala/Spark 
> jars into their uberjar, it won't blow up; this would be a good enhancement, 
> I think.)
> Anyway, we have a custom Kryo registrar, which ships in our uberjar, but 
> since it "extends spark.KryoRegistrator", which is not in our uberjar, we get 
> a ClassNotFoundException.
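
Sketching the parent-only whitelist suggested in the description above, on 
top of the child-first loader from the earlier sketch (the package prefixes 
are illustrative assumptions, not an existing Spark or Jetty setting):

{code:scala}
import java.net.URL

// Sketch of the "parent-only packages" idea, building on the
// ChildFirstClassLoader sketched in the comment above.
class WhitelistingClassLoader(urls: Array[URL], parent: ClassLoader)
    extends ChildFirstClassLoader(urls, parent) {

  // Illustrative list: classes under these packages always come from the
  // parent class loader.
  private val parentOnlyPrefixes = Seq("java.", "scala.", "org.apache.spark.")

  override def loadClass(name: String, resolve: Boolean): Class[_] =
    if (parentOnlyPrefixes.exists(p => name.startsWith(p))) {
      // Even if Spark/Scala jars leak into the uberjar, the parent's copy
      // wins, and a superclass like the Spark KryoRegistrator resolves from
      // the parent instead of triggering a ClassNotFoundException.
      getParent.loadClass(name) // assumes a non-null (non-bootstrap) parent
    } else {
      super.loadClass(name, resolve) // child-first path from the sketch above
    }
}
{code}

Jetty's webapp class loader has a similar notion of "system classes" that 
always come from the parent, which I believe is what the description is 
alluding to.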


