[ 
https://issues.apache.org/jira/browse/SPARK-10910?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14941209#comment-14941209
 ] 

Thomas Graves commented on SPARK-10910:
---------------------------------------

So I understand that perhaps this mechanism won't work for it, and it's also 
marked experimental.  In this case spark.yarn.user.classpath.first worked 
because it puts the user jar directly on the system classpath, and as Marcelo 
pointed out, extraClassPath probably would have worked as well.

I think we should either give users a way to override libraries that ship 
with Spark, or shade those libraries and make users provide their own.  I 
just want to make sure we don't deprecate a mechanism that allows overriding 
in favor of one that doesn't.
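For reference, the two workarounds mentioned above can be expressed in 
spark-defaults.conf roughly as follows (a sketch only; the jar path is a 
placeholder, and the property names are the ones discussed in this thread):

```
# YARN-specific, legacy setting: puts the user jar directly on the
# system classpath, which is why it avoided the LinkageError here.
spark.yarn.user.classpath.first   true

# Generic alternative: prepend the jar to the driver/executor classpath.
# /path/to/kryo.jar is a placeholder for the user-provided jar.
spark.driver.extraClassPath       /path/to/kryo.jar
spark.executor.extraClassPath     /path/to/kryo.jar
```

Note that with extraClassPath the jar must already be present at that path on 
the driver and executor machines (or be distributed alongside the app).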

> spark.{executor,driver}.userClassPathFirst don't work for kryo (probably 
> others)
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-10910
>                 URL: https://issues.apache.org/jira/browse/SPARK-10910
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, YARN
>    Affects Versions: 1.5.1
>            Reporter: Thomas Graves
>
> Trying to use spark.{executor,driver}.userClassPathFirst to pull in a newer 
> version of kryo doesn't work.  Note I was running on YARN.
> Spark uses kryo 1.21, which has a bug that is fixed in kryo 1.24.  A 
> customer tried to use spark.{executor,driver}.userClassPathFirst to include 
> the newer version of kryo, but it threw the following exception:
> 15/09/29 21:36:43 ERROR yarn.ApplicationMaster: User class threw exception: 
> java.lang.LinkageError: loader constraint violation: loader (instance of 
> org/apache/spark/util/ChildFirstURLClassLoader) previously initiated loading 
> for a different type with name "com/esotericsoftware/kryo/Kryo"
> java.lang.LinkageError: loader constraint violation: loader (instance of 
> org/apache/spark/util/ChildFirstURLClassLoader) previously initiated loading 
> for a different type with name "com/esotericsoftware/kryo/Kryo"
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
> The issue here is that the Spark Driver instantiates a kryo class in SparkEnv:
>  val serializer = instantiateClassFromConf[Serializer](
>       "spark.serializer", "org.apache.spark.serializer.JavaSerializer")
>     logDebug(s"Using serializer: ${serializer.getClass}")
> It uses whatever version is in the spark assembly jar.
> Then, on YARN, the ApplicationMaster sets up the ChildFirstURLClassLoader to 
> honor the user-classpath-first setting before it starts the user 
> application; that loader is used later when kryo is needed. It tries to load 
> the newer version of kryo from the user jar, which throws the exception 
> above.
> I'm sure this could happen with any number of other classes that got loaded 
> by Spark before we try to run the user application code.
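To make the failure mode above concrete: a child-first loader inverts the 
usual parent-first delegation, so the same class name can resolve to two 
different Class objects in one JVM, which the loader-constraint check rejects. 
Below is a minimal illustrative sketch of that delegation pattern (an 
assumption-laden reimplementation for explanation only, not Spark's actual 
ChildFirstURLClassLoader):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Illustrative child-first class loader, in the spirit of Spark's
// ChildFirstURLClassLoader. NOT Spark's actual code; it only shows the
// delegation order that leads to the LinkageError described above.
public class ChildFirstLoader extends URLClassLoader {

    public ChildFirstLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            // Reuse a class this loader has already defined.
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // Child-first: search our own URLs (the user jars) BEFORE
                    // delegating to the parent. If the parent loader already
                    // handed out a different com.esotericsoftware.kryo.Kryo,
                    // defining a second one here violates the JVM's loader
                    // constraints and surfaces as a LinkageError.
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    // Not in the user jars: fall back to parent delegation.
                    c = super.loadClass(name, false);
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}
```

With no user jars, everything falls through to the parent, which is why the 
problem only appears once a user jar shadows a class the parent loader has 
already loaded.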



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
