[ https://issues.apache.org/jira/browse/SPARK-1923?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-1923:
-----------------------------------

    Description: 
I just wanted to document this for posterity. I hit an issue when running a 
Spark 1.0 app locally with sbt. The issue appears when you both:

1. Reference a Scala class (e.g. None) inside a closure, and
2. Run your program with 'sbt run'.

In that case the job throws an exception. Upgrading the scalaVersion to 2.10.4 
in sbt solved the issue; somehow Scala classes were not being loaded correctly 
inside the executors:

Application:
{code}
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object Test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[4]").setAppName("Test")
    val sc = new SparkContext(conf)
    sc.makeRDD(1 to 1000, 10).map(x => Some(x)).count
    sc.stop()
  }
}
{code}

Exception:
{code}
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:1 
failed 1 times, most recent failure: Exception failure in TID 1 on host 
localhost: java.lang.ClassNotFoundException: scala.None$
        java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        java.security.AccessController.doPrivileged(Native Method)
        java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        java.lang.Class.forName0(Native Method)
        java.lang.Class.forName(Class.java:270)
        org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:60)
        java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
{code}
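
For reference, the fix amounted to bumping the Scala version in the sbt build. A minimal build.sbt along these lines illustrates the change; the project name and the exact spark-core artifact/version are assumptions for illustration, not copied from the original build file:

{code}
// Minimal build.sbt sketch (assumed; not the actual build file behind this report)
name := "spark-sbt-run-test"  // hypothetical project name

// Scala 2.10.3 hits the scala.None$ ClassNotFoundException under 'sbt run';
// bumping the version to 2.10.4 resolved it.
scalaVersion := "2.10.4"

// Spark 1.0 dependency (artifact and version assumed for illustration)
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
{code}

With the updated scalaVersion in place, 'sbt run' on the application above completes without the exception.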


  was:
I just wanted to document this for posterity. I had an issue when running a 
Spark 1.0 app locally with sbt. Upgrading the scalaVersion to 2.10.4 in sbt 
solved this issue. Somehow scala classes were not being loaded correctly inside 
of the executors:

Application:
{code}
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object Test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[4]").setAppName("Test")
    val sc = new SparkContext(conf)
    sc.makeRDD(1 to 1000, 10).map(x => Some(x)).count
    sc.stop()
  }
}
{code}

Exception:
{code}
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:1 
failed 1 times, most recent failure: Exception failure in TID 1 on host 
localhost: java.lang.ClassNotFoundException: scala.None$
        java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        java.security.AccessController.doPrivileged(Native Method)
        java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        java.lang.Class.forName0(Native Method)
        java.lang.Class.forName(Class.java:270)
        org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:60)
        java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
{code}



> ClassNotFoundException when running with sbt and Scala 2.10.3
> -------------------------------------------------------------
>
>                 Key: SPARK-1923
>                 URL: https://issues.apache.org/jira/browse/SPARK-1923
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>
> I just wanted to document this for posterity. I had an issue when running a 
> Spark 1.0 app locally with sbt. The issue was that if you both:
> 1. Reference a scala class (e.g. None) inside of a closure.
> 2. Run your program with 'sbt run'
> It throws an exception. Upgrading the scalaVersion to 2.10.4 in sbt solved 
> this issue. Somehow scala classes were not being loaded correctly inside of 
> the executors:
> Application:
> {code}
> import org.apache.spark.SparkConf
> import org.apache.spark.SparkContext
> object Test {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf().setMaster("local[4]").setAppName("Test")
>     val sc = new SparkContext(conf)
>     sc.makeRDD(1 to 1000, 10).map(x => Some(x)).count
>     sc.stop()
>   }
> }
> {code}
> Exception:
> {code}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:1 
> failed 1 times, most recent failure: Exception failure in TID 1 on host 
> localhost: java.lang.ClassNotFoundException: scala.None$
>         java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         java.security.AccessController.doPrivileged(Native Method)
>         java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         java.lang.Class.forName0(Native Method)
>         java.lang.Class.forName(Class.java:270)
>         org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:60)
>         java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
>         java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> {code}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
