Re: serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Qiuzhuang Lian
After doing a clean rebuild, it works now.

Thanks,
Qiuzhuang

On Sat, Oct 25, 2014 at 9:42 AM, Nan Zhu wrote:

>  In my experience, there are more issues than just BlockManager when you
> try to run a Spark application whose build version differs from your
> cluster's….
>
> I once tried to make a JDBC server built from branch-jdbc-1.0 run against
> a branch-1.0 cluster… no workaround existed… I just had to replace the
> cluster jar with the branch-jdbc-1.0 jar…
>
> Best,
>
> --
> Nan Zhu
>
> On Friday, October 24, 2014 at 9:23 PM, Josh Rosen wrote:
> [quoted text trimmed]

Re: serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Nan Zhu
In my experience, there are more issues than just BlockManager when you try
to run a Spark application whose build version differs from your cluster's….

I once tried to make a JDBC server built from branch-jdbc-1.0 run against a
branch-1.0 cluster… no workaround existed… I just had to replace the cluster
jar with the branch-jdbc-1.0 jar…

Best,  

--  
Nan Zhu


On Friday, October 24, 2014 at 9:23 PM, Josh Rosen wrote:

> [quoted text trimmed]

Re: serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Qiuzhuang Lian
I updated git trunk and built on the two Linux machines, so I think they
should have the same version. I am going to do a forced clean build and then
retry.

Thanks.


On Sat, Oct 25, 2014 at 9:23 AM, Josh Rosen wrote:

> [quoted text trimmed]

Re: serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Josh Rosen
Are all processes (Master, Worker, Executors, Driver) running the same Spark 
build?  This error implies that you’re seeing protocol / binary 
incompatibilities between your Spark driver and cluster.

Spark is API-compatible across the 1.x series, but we don’t make binary 
link-level compatibility guarantees: 
https://cwiki.apache.org/confluence/display/SPARK/Spark+Versioning+Policy.  
This means that your Spark driver’s runtime classpath should use the same 
version of Spark that’s installed on your cluster.  You can compile against a 
different API-compatible version of Spark, but the runtime versions must match 
across all components.
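
One quick sanity check, as a sketch rather than an official recipe (it
assumes a live SparkContext in spark-shell, and that the internal
org.apache.spark.SPARK_VERSION constant is visible, as it is in branches of
this era): run a trivial job and compare the version string the driver
reports with what each executor reports.

    // Sketch: compare the driver's Spark build with what executors see.
    // If the builds are mismatched badly enough, this job itself dies with
    // the same InvalidClassException, which answers the question anyway.
    val driverVersion = sc.version
    val executorVersions = sc.parallelize(1 to 100, 4)
      .map(_ => org.apache.spark.SPARK_VERSION)
      .distinct()
      .collect()
    println(s"driver = $driverVersion, executors = ${executorVersions.mkString(", ")}")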

To fix this issue, I’d check that you’ve run the “package” and “assembly” 
phases and that your Spark cluster is using this updated version.
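(On a checkout from this era that usually means sbt/sbt clean assembly, or
mvn -DskipTests clean package for the Maven build; the exact invocation
varies by branch, so check the build docs.)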

- Josh

On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (qiuzhuang.l...@gmail.com) 
wrote:

[quoted text trimmed]

serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Qiuzhuang Lian
Hi,

I updated git today, and when connecting to the Spark cluster I got the
serialVersionUID incompatible error in class BlockManagerId.

Here is the log:

Shouldn't we give BlockManagerId a constant serialVersionUID to avoid this?
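
For illustration, the kind of change I mean looks like this; it is a
hypothetical sketch with made-up fields and UID, not the actual Spark source:

    // Hypothetical sketch, not Spark's real BlockManagerId. Pinning the UID
    // stops Java serialization from rejecting the class across builds whose
    // *generated* UIDs differ, as long as the fields stay wire-compatible.
    // (If two builds genuinely diverge, a fixed UID only hides the symptom.)
    @SerialVersionUID(1L)
    class BlockManagerId(val executorId: String, val host: String, val port: Int)
      extends Serializable

And to see which UID each machine actually computed (standard JDK API,
useful for pinpointing the stale jar):

    import java.io.ObjectStreamClass
    val uid = ObjectStreamClass
      .lookup(classOf[org.apache.spark.storage.BlockManagerId])
      .getSerialVersionUID
    println(uid)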

Thanks,
Qiuzhuang

scala> val rdd = sc.parparallelize(1 to 1000
14/10/25 09:10:48 ERROR Remoting: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
    at scala.util.Try$.apply(Try.scala:161)
    at akka.serialization.Serialization.deserialize(Serialization.scala:98)
    at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
    at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
    at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
    at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
    at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
14/10/25 09:10:48 ERROR SparkDeploySchedulerBackend: Asked to remove non existant executor 100
14/10/25 09:11:21 ERROR Remoting: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
    [same stack trace as above; remainder of the message truncated in the archive]