Fwd: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Babar Tareen
Resending, haven't found a workaround. Any help is highly appreciated.

-- Forwarded message --
From: Babar Tareen <babartar...@gmail.com>
Date: Thu, Oct 22, 2015 at 2:47 PM
Subject: Getting ClassNotFoundException: scala.Some on Spark 1.5.x
To: user@spark.apache.org


Hi,

I am getting the following exception when submitting a job to Spark 1.5.x from
Scala. The same code works with Spark 1.4.1. Any clues as to what might be
causing the exception?



*Code: App.scala*
import org.apache.spark.SparkContext

object App {
  def main(args: Array[String]): Unit = {
    val l = List(1, 2, 3, 4, 5, 6, 7, 8, 9, 0)
    val sc = new SparkContext("local[4]", "spark-test")  // local mode with 4 worker threads
    val rdd = sc.parallelize(l)
    rdd.foreach(println)     // runs on the executor side
    println(rdd.collect())   // brings all elements back to the driver
  }
}
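
(Side note, unrelated to the failure: println(rdd.collect()) prints the Array's
default toString, something like "[I@6d06d69c". To see the actual elements,
mkString works:

println(rdd.collect().mkString(", "))   // prints: 1, 2, 3, 4, 5, 6, 7, 8, 9, 0
)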

*build.sbt*
lazy val sparkjob = (project in file("."))
  .settings(
    name := "SparkJob",
    version := "1.0",
    scalaVersion := "2.11.6",
    libraryDependencies := libs
  )

lazy val libs = Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1"
)


*Exception:*
15/10/22 14:32:42 INFO DAGScheduler: Job 0 failed: foreach at app.scala:9, took 0.689832 s
[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 1 times, most recent failure: Lost task 2.0 in stage 0.0 (TID 2, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.Some
[error] at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1163)
[error] at org.apache.spark.Accumulable.readObject(Accumulators.scala:151)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:497)
[error] at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
[error] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1896)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
[error] at java.io.ObjectInputStream.skipCustomData(ObjectInputStream.java:1959)
[error] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
[error] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
[error] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
[error] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
[error] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
[error] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
[error] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] Caused by: java.lang.ClassNotFoundException: scala.Some
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[error] at java.lang.Class.forName0(Native Method)
[error] at java.lang.Class.forName(Class.java:348)
[error] at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
[error] at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
[error] at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
[error] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
[error] at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:501)
[error] at org.apache.spark.Accumulable$$anonfun$readObject$1.apply$mcV$sp(Accumulators.scala:152)
[error] at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1160)
[error] ... 24 more
[error]
[error] Driver stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 1 times, most recent failure: Lost task 2.0 in

Re: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Babar Tareen
I am using *'sbt run'* to execute the code. Detailed sbt output is here (
https://drive.google.com/open?id=0B2dlA_DzEohVakpValRjRS1zVG8).

I had Scala 2.11.7 installed on my machine, but even after uninstalling it I
am still getting the exception with 2.11.6.
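
Worth noting: sbt compiles against whatever scalaVersion is set in build.sbt
and fetches that scala-library itself, so a system-wide Scala install (or
removing one) shouldn't matter. To double-check what the build actually
resolves, these sbt 0.13 commands work:

sbt "show scalaVersion"            # the Scala version the build uses
sbt "show runtime:fullClasspath"   # run classpath, incl. the resolved scala-library jar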

Changing the Scala version to 2.11.7 in build.sbt fixes the exception, as you
suggested. I am unclear why it works with 2.11.7 and not 2.11.6.
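
For reference, the build.sbt that works now, with the single change applied
(the fork line is an extra, untested guess on my part, not something confirmed
in this thread):

*build.sbt*
lazy val sparkjob = (project in file("."))
  .settings(
    name := "SparkJob",
    version := "1.0",
    scalaVersion := "2.11.7",   // was 2.11.6; this one-line change fixes the exception
    libraryDependencies := libs,
    fork in run := true         // untested guess: runs the job in its own JVM, isolated from sbt's classloader
  )

lazy val libs = Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1"
)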

Thanks,
Babar

On Mon, Nov 2, 2015 at 2:10 PM Jonathan Coveney <jcove...@gmail.com> wrote:

> Caused by: java.lang.ClassNotFoundException: scala.Some
>
> indicates that you don't have the Scala libs present. How are you
> executing this? My guess is that the issue is a conflict between Scala
> 2.11.6 in your build and 2.11.7, but I'm not sure... try setting your
> Scala to 2.11.7?
>
> But really, first it'd be good to see what command you're using to invoke
> this.
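
For completeness, the usual alternative to 'sbt run' is packaging the job and
launching it with spark-submit; the jar path below is what sbt 0.13 typically
produces for this build (the name may differ):

sbt package
$SPARK_HOME/bin/spark-submit --class App target/scala-2.11/sparkjob_2.11-1.0.jar

Since App hard-codes its master as local[4], no --master flag is needed here.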

Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-10-22 Thread Babar Tareen

Thanks,
Babar Tareen