[ 
https://issues.apache.org/jira/browse/SPARK-19675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15877256#comment-15877256
 ] 

Kohki Nishio edited comment on SPARK-19675 at 2/22/17 1:54 AM:
---------------------------------------------------------------

My application is a notebook-style application; this is the same problem as 
Zeppelin failing to load resources:

https://issues.apache.org/jira/browse/SPARK-11818

The same error message also shows up in a different notebook project:

https://github.com/andypetrella/spark-notebook/issues/346

I would say this is not really a strange use case, especially in a notebook 
setting where the REPL does its own class loading. This is a real issue; in 
fact it is blocking me, and there is no workaround.

Why do you think this would break the YARN case? loadClass can try the given 
class loader first and then fall back to the original delegation, so trying 
the given class loader first is the right thing to do:

{code}
try { parentLoader.loadClass(name) }
catch { case _: ClassNotFoundException => super.loadClass(name) }
{code}
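As a minimal, self-contained sketch of that ordering (the names 
{{ChildFirstClassLoader}} and {{fallback}} are illustrative here, not 
Spark's actual ExecutorClassLoader API):

{code}
import java.net.{URL, URLClassLoader}

// Hypothetical child-first loader: look in this loader's own URLs first,
// and only fall back to the given loader when the class is not found there.
class ChildFirstClassLoader(urls: Array[URL], fallback: ClassLoader)
    extends URLClassLoader(urls, null) {

  override def loadClass(name: String, resolve: Boolean): Class[_] =
    try {
      // Try this loader's own URLs (parent is null, so no app-loader delegation)
      super.loadClass(name, resolve)
    } catch {
      case _: ClassNotFoundException =>
        // Not found here: delegate to the fallback loader
        fallback.loadClass(name)
    }
}
{code}

With this ordering, a class present in the loader's own URLs (e.g. 
scala-library-2.11.8.jar) shadows the copy visible through the fallback 
loader, which is exactly what the notebook case needs.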



> ExecutorClassLoader loads classes from SystemClassLoader
> --------------------------------------------------------
>
>                 Key: SPARK-19675
>                 URL: https://issues.apache.org/jira/browse/SPARK-19675
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.2.0
>         Environment: sbt / Play Framework
>            Reporter: Kohki Nishio
>            Priority: Minor
>
> The Spark executor loads classes from the SystemClassLoader, which contains 
> sbt-launch.jar (built against Scala 2.10); however, Spark itself is built 
> with Scala 2.11, so deserialization throws an InvalidClassException:
> java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632
>       at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> ExecutorClassLoader's desired class loader (parentLoader) actually contains 
> the correct path (scala-library-2.11.8.jar), but it is not being used.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
