[ https://issues.apache.org/jira/browse/SPARK-10910?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14941175#comment-14941175 ]

Sean Owen commented on SPARK-10910:
-----------------------------------

I suspect this is one of several libraries that simply can't be overridden this 
way, because of the way they are used internally in Spark. There is a 
classloader problem no matter which way you turn. I can't say for certain 
there's no way to make it work, but my expectation is that it would not.
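
To make the "classloader problem" concrete, here is a small, purely illustrative
Scala sketch (assuming it runs inside a Spark application that has both the
assembly's kryo and a newer kryo on the user classpath; this is not Spark code):

    // Spark's own classes were loaded by Spark's loader, so they linked against
    // the Kryo class from the Spark assembly jar.
    val sparkLoader = classOf[org.apache.spark.SparkConf].getClassLoader
    val kryoSeenBySpark = sparkLoader.loadClass("com.esotericsoftware.kryo.Kryo")

    // With spark.{executor,driver}.userClassPathFirst, user code runs under a
    // ChildFirstURLClassLoader that resolves the newer kryo jar first.
    val userLoader = Thread.currentThread().getContextClassLoader
    val kryoSeenByUser = userLoader.loadClass("com.esotericsoftware.kryo.Kryo")

    // Same binary name, but two distinct Class objects: mixing them is the kind
    // of conflict the JVM reports as a LinkageError.
    println(kryoSeenBySpark eq kryoSeenByUser)   // false in the failing setup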

> spark.{executor,driver}.userClassPathFirst don't work for kryo (probably 
> others)
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-10910
>                 URL: https://issues.apache.org/jira/browse/SPARK-10910
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, YARN
>    Affects Versions: 1.5.1
>            Reporter: Thomas Graves
>
> Trying to use spark.{executor,driver}.userClassPathFirst to pull in a newer 
> version of kryo doesn't work. Note I was running on YARN.
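> For reference, this is roughly the kind of configuration being attempted (the
> property keys are real Spark settings; the jar path is just a placeholder):
>
>     import org.apache.spark.SparkConf
>
>     // Ask Spark to prefer the user's jars over the assembly when loading classes.
>     val conf = new SparkConf()
>       .set("spark.driver.userClassPathFirst", "true")
>       .set("spark.executor.userClassPathFirst", "true")
>       // Placeholder path to the newer kryo jar shipped with the application.
>       .set("spark.jars", "/path/to/kryo-newer.jar")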
> There is a bug in the kryo 1.21 that Spark uses which is fixed in kryo 1.24. 
> A customer tried to use spark.{executor,driver}.userClassPathFirst to 
> include the newer version of kryo, but it threw the following exception:
> 15/09/29 21:36:43 ERROR yarn.ApplicationMaster: User class threw exception: 
> java.lang.LinkageError: loader constraint violation: loader (instance of 
> org/apache/spark/util/ChildFirstURLClassLoader) previously initiated loading 
> for a different type with name "com/esotericsoftware/kryo/Kryo"
> java.lang.LinkageError: loader constraint violation: loader (instance of 
> org/apache/spark/util/ChildFirstURLClassLoader) previously initiated loading 
> for a different type with name "com/esotericsoftware/kryo/Kryo"
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
> The issue here is that the Spark driver instantiates the kryo serializer class in SparkEnv:
>     val serializer = instantiateClassFromConf[Serializer](
>       "spark.serializer", "org.apache.spark.serializer.JavaSerializer")
>     logDebug(s"Using serializer: ${serializer.getClass}")
> It uses whatever version is in the spark assembly jar.
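> Roughly, that instantiation amounts to the following (a hedged paraphrase, not
> the exact SparkEnv code); the key point is that the class is resolved with
> Spark's own classloader, so a KryoSerializer links against the assembly's kryo:
>
>     import org.apache.spark.SparkConf
>     import org.apache.spark.serializer.Serializer
>
>     // Sketch of instantiating the serializer named in the conf. Class.forName
>     // here resolves against Spark's own loader, not the user's
>     // ChildFirstURLClassLoader.
>     def instantiateSerializer(conf: SparkConf): Serializer = {
>       val className = conf.get("spark.serializer",
>         "org.apache.spark.serializer.JavaSerializer")
>       Class.forName(className)
>         .getConstructor(classOf[SparkConf])
>         .newInstance(conf)
>         .asInstanceOf[Serializer]
>     }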
> Then on YARN, in the ApplicationMaster code, before it starts the user 
> application it honors the user-classpath-first setting by making a 
> ChildFirstURLClassLoader the loader for user code, which is later used when 
> kryo is needed. That loader tries to load the newer version of kryo from the 
> user jar, and it throws the exception above.
> I'm sure this could happen with any number of other classes that got loaded 
> by Spark before we try to run the user application code.


