Have you looked through the logs fully? In my limited experience, I have seen this pop up as a result of previous exceptions/errors, and also when objects could not be serialized, etc.
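On the serialization angle, a minimal sketch of a clean round-trip (the field names here are made up; only the shape matters): objectFile uses Java serialization by default, so the element type must be Serializable and its definition must be on every executor's classpath:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical stand-in for value.models.ReIdDataSetEntry; case classes
// are Serializable out of the box, which is what objectFile needs.
case class ReIdDataSetEntry(id: Long, label: String)

object RoundTrip {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("round-trip"))
    // Save and reload an RDD of the entry type; deserialization happens
    // on the executors, so the class must be visible there.
    sc.parallelize(Seq(ReIdDataSetEntry(1L, "a"))).saveAsObjectFile("data")
    val reloaded = sc.objectFile[ReIdDataSetEntry]("data")
    println(reloaded.count())
    sc.stop()
  }
}
```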
Ognen

On 3/26/14, 10:39 AM, Jaonary Rabarisoa wrote:
I noticed that I get this error when I try to load an objectFile with val viperReloaded = context.objectFile[ReIdDataSetEntry]("data")
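Since the ClassNotFoundException is thrown on an executor during deserialization, one thing worth trying is shipping the application jar through the context's configuration rather than relying on addJar alone. A minimal sketch (the jar path is the one from the log below; everything else is assumed):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ReloadExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("reid-reload")
      // Distribute the jar that contains value.models.ReIdDataSetEntry
      // so executor class loaders can resolve it during deserialization.
      .setJars(Seq("target/scala-2.10/value-spark_2.10-1.0.jar"))
    val context = new SparkContext(conf)

    // The lookup failure happens here, on the executors, if the jar
    // (or the class inside it) is missing from their classpath.
    val viperReloaded = context.objectFile[value.models.ReIdDataSetEntry]("data")
    println(viperReloaded.count())
    context.stop()
  }
}
```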


On Wed, Mar 26, 2014 at 3:58 PM, Jaonary Rabarisoa <jaon...@gmail.com> wrote:

    Here is the output that I get:

    [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
    org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1011)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1009)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1009)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:596)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:146)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

    Spark says that the jar is added:

    14/03/26 15:49:18 INFO SparkContext: Added JAR
    target/scala-2.10/value-spark_2.10-1.0.jar

    On Wed, Mar 26, 2014 at 3:34 PM, Ognen Duzlevski <og...@plainvanillagames.com> wrote:

        Have you looked at the individual nodes' logs? Can you post a
        bit more of the exception's output?


        On 3/26/14, 8:42 AM, Jaonary Rabarisoa wrote:

            Hi all,

            I got java.lang.ClassNotFoundException even with "addJar"
            called. The jar file is present in each node.

            I use the version of spark from github master.

            Any ideas?


            Jaonary

