Now I'm running up against another problem while trying to schedule tasks:

15/05/01 22:32:03 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2419)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1380)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:180)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)


I verified that the same configuration works without using Kryo serialization.
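
For reference, the Kryo side of the configuration is essentially the following (a
trimmed-down sketch, not the exact code; com.example.Schema$MyRow stands in for the
real protobuf class from the earlier trace):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("kryo-repro")  // illustrative app name
      // switch from the default Java serialization to Kryo
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // classes are registered by name; KryoSerializer resolves each with Class.forName
      .set("spark.kryo.classesToRegister", "com.example.Schema$MyRow")
    val sc = new SparkContext(conf)

With these settings the class names are resolved while the executor's SparkEnv is
being created (the KryoSerializer constructor in the earlier trace), which is why
the previous failure happened at executor startup rather than during a task.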


On Fri, May 1, 2015 at 9:44 AM, Akshat Aranya <aara...@gmail.com> wrote:
> I cherry-picked the fix for SPARK-5470 and the problem has gone away.
>
> On Fri, May 1, 2015 at 9:15 AM, Akshat Aranya <aara...@gmail.com> wrote:
>> Yes, this class is present in the jar that was on the classpath of the
>> executor Java process -- it wasn't even added lazily as part of task
>> execution.  Schema$MyRow is a protobuf-generated class.
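>>
>> (For reference, the $ in the class name is just the JVM binary name of the
>> nested protobuf message type; assuming the usual protoc --java_out layout,
>> in Scala:
>>
>>     classOf[com.example.Schema.MyRow].getName  // => "com.example.Schema$MyRow"
>>
>> i.e. MyRow is a static nested class inside the generated Schema outer class.)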
>>
>> After doing some digging around, I think I might be hitting up against
>> SPARK-5470, the fix for which hasn't been merged into 1.2, as far as I
>> can tell.
>>
>> On Fri, May 1, 2015 at 9:05 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>> bq. Caused by: java.lang.ClassNotFoundException: com.example.Schema$MyRow
>>>
>>> So the above class is in the jar that was on the classpath?
>>> Can you tell us a bit more about Schema$MyRow?
>>>
>>> On Fri, May 1, 2015 at 8:05 AM, Akshat Aranya <aara...@gmail.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I'm getting a ClassNotFoundException at the executor when trying to
>>>> register a class for Kryo serialization:
>>>>
>>>> java.lang.reflect.InvocationTargetException
>>>>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>       at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>       at org.apache.spark.SparkEnv$.instantiateClass$1(SparkEnv.scala:243)
>>>>       at org.apache.spark.SparkEnv$.instantiateClassFromConf$1(SparkEnv.scala:254)
>>>>       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:257)
>>>>       at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:182)
>>>>       at org.apache.spark.executor.Executor.<init>(Executor.scala:87)
>>>>       at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse(CoarseGrainedExecutorBackend.scala:61)
>>>>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>>>>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>>>>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
>>>>       at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
>>>>       at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
>>>>       at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
>>>>       at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
>>>>       at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>>>>       at org.apache.spark.executor.CoarseGrainedExecutorBackend.aroundReceive(CoarseGrainedExecutorBackend.scala:36)
>>>>       at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>>>>       at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>>>>       at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>>>>       at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>>>>       at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>>>>       at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>>>       at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>>>       at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>>>       at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>>> Caused by: org.apache.spark.SparkException: Failed to load class to register with Kryo
>>>>       at org.apache.spark.serializer.KryoSerializer$$anonfun$2.apply(KryoSerializer.scala:66)
>>>>       at org.apache.spark.serializer.KryoSerializer$$anonfun$2.apply(KryoSerializer.scala:61)
>>>>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>>       at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>>>>       at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
>>>>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>>>>       at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
>>>>       at org.apache.spark.serializer.KryoSerializer.<init>(KryoSerializer.scala:61)
>>>>       ... 28 more
>>>> Caused by: java.lang.ClassNotFoundException: com.example.Schema$MyRow
>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>       at java.security.AccessController.doPrivileged(Native Method)
>>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>       at java.lang.Class.forName0(Native Method)
>>>>       at java.lang.Class.forName(Class.java:190)
>>>>       at org.apache.spark.serializer.KryoSerializer$$anonfun$2.apply(KryoSerializer.scala:63)
>>>>
>>>> I have verified that when the executor process is launched, my jar is in
>>>> the classpath of the command line of the executor.  I expect the class to
>>>> be found by the default classloader being used at KryoSerializer.scala:63.
>>>>
>>>> Any ideas?
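>>>>
>>>> (For what it's worth, a quick way to check, with Kryo disabled, that the
>>>> class is visible to the executors independently of Kryo registration is a
>>>> probe along these lines:
>>>>
>>>>     // forces Class.forName inside each executor JVM and reports back
>>>>     sc.parallelize(1 to 100, 10)  // 100 dummy elements over 10 partitions, numbers arbitrary
>>>>       .map(_ => Class.forName("com.example.Schema$MyRow").getName)
>>>>       .collect()
>>>>       .foreach(println)
>>>>
>>>> If this throws the same ClassNotFoundException, the jar isn't actually
>>>> reaching the executors; if it prints the class name, the problem is specific
>>>> to the classloader used during Kryo registration.)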
>>>
>>>
