[ https://issues.apache.org/jira/browse/SPARK-19675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15876854#comment-15876854 ]

Shixiong Zhu commented on SPARK-19675:
--------------------------------------

[~taroplus] If I understand correctly, SBT launches your application in a new 
JVM process *without any SBT code*. But your Spark application needs to run 
Spark code, prebuilt against a specific Scala version, *before 
ExecutorClassLoader is even created*. That's a totally different situation.
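
For what it's worth, that forked launch is a one-line sbt setting; a minimal 
build.sbt sketch (my illustration, not from the original comment):

{code}
// build.sbt (sketch): with forking enabled, `sbt run` starts the application
// in a fresh JVM whose classpath contains only the project dependencies,
// not sbt-launch.jar with sbt's own bundled Scala.
fork := true
{code}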

In your example, `Class.forName("scala.Option", false, executorLoader)` doesn't 
call `ExecutorClassLoader.findClass` because "scala.Option" is already loaded 
by an ancestor loader and is returned through the normal parent delegation in 
`loadClass` before `findClass` is ever consulted.

{code}
scala> :paste
// Entering paste mode (ctrl-D to finish)

import java.net._
import org.apache.spark._
import org.apache.spark.repl._

val desiredLoader = new URLClassLoader(
           Array(new URL("file:/tmp/scala-library-2.11.0.jar")),
           null)

val executorLoader = new ExecutorClassLoader(
      new SparkConf(),
      null,
      "",
      desiredLoader,
      false) {
  override def findClass(name: String): Class[_] = {
      println("finding class: " + name)
      super.findClass(name)
  }
}

Class.forName("scala.Option", false, executorLoader).getClassLoader()

// Exiting paste mode, now interpreting.

import java.net._
import org.apache.spark._
import org.apache.spark.repl._
desiredLoader: java.net.URLClassLoader = java.net.URLClassLoader@37f41a81
executorLoader: org.apache.spark.repl.ExecutorClassLoader = $anon$1@1c3d9e28
res0: ClassLoader = sun.misc.Launcher$AppClassLoader@4d76f3f8

scala> 
{code}

In the above example, you can see that `findClass` is never called, and `res0` 
shows the class was defined by `sun.misc.Launcher$AppClassLoader`, not by 
`desiredLoader`.
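
That is just the standard JDK parent-delegation contract of 
`ClassLoader.loadClass`: `findClass` is reached only after every loader in 
the parent chain has failed to resolve the name. A minimal sketch that shows 
this without Spark (the class name "no.such.Clazz" is made up for 
illustration):

{code}
// Tracing loader parented to the current loader; findClass fires only when
// the whole parent chain fails to resolve the requested name.
val tracingLoader = new ClassLoader(getClass.getClassLoader) {
  override def findClass(name: String): Class[_] = {
    println("finding class: " + name)
    super.findClass(name)  // default impl just throws ClassNotFoundException
  }
}

Class.forName("scala.Option", false, tracingLoader)   // resolved by a parent; prints nothing
Class.forName("no.such.Clazz", false, tracingLoader)  // prints, then throws ClassNotFoundException
{code}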


> ExecutorClassLoader loads classes from SystemClassLoader
> --------------------------------------------------------
>
>                 Key: SPARK-19675
>                 URL: https://issues.apache.org/jira/browse/SPARK-19675
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.2.0
>         Environment: sbt / Play Framework
>            Reporter: Kohki Nishio
>            Priority: Minor
>
> The Spark executor loads classes from the SystemClassLoader, which contains 
> sbt-launch.jar and therefore Scala 2.10 binaries; Spark itself, however, is 
> built against Scala 2.11, so deserialization throws an InvalidClassException:
> java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632
>       at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> ExecutorClassLoader's desired class loader (parentLoader) actually contains 
> the correct path (scala-library-2.11.8.jar), but it is not being used.
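>
> A quick way to observe the mismatch (a sketch for illustration, not part of 
> the original report) is to ask Java serialization for the serialVersionUID 
> it computes for scala.Option under each classpath:
> {code}
> // Prints the serialVersionUID of scala.Option as seen on the current
> // classpath; running with scala-library 2.10 vs 2.11 should yield the two
> // incompatible IDs quoted in the exception above.
> println(java.io.ObjectStreamClass.lookup(classOf[Option[_]]).getSerialVersionUID)
> {code}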


