What version of Java?
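
Asking because, when a class doesn't declare serialVersionUID explicitly, the JVM derives one from the class's structure, so two nodes loading different builds of the same class can disagree. A quick way to compare what each node derives locally, as a minimal sketch assuming the Spark jars are on the classpath:

    import java.io.ObjectStreamClass

    // Minimal sketch: print the serialVersionUID this JVM derives for a class.
    // Run it on the master and on a worker; differing values mean the two
    // nodes are loading different builds of the class.
    object UidCheck {
      def main(args: Array[String]): Unit = {
        val cls  = Class.forName("org.apache.spark.rpc.RpcEndpointRef")
        val desc = ObjectStreamClass.lookup(cls) // null if cls is not Serializable
        println(s"${cls.getName} serialVersionUID = ${desc.getSerialVersionUID}")
      }
    }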

On Feb 1, 2018 11:30 AM, "Mihai Iacob" <mia...@ca.ibm.com> wrote:

> I am setting up a Spark 2.2.1 cluster; however, when I bring up the master
> and workers (both on Spark 2.2.1) I get the error below. I tried Spark 2.2.0
> and got the same error. It works fine on Spark 2.0.2. Have you seen this
> before? Any idea what's wrong?
>
> I found this, but it's in a different situation:
> https://github.com/apache/spark/pull/19802
>
>
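The trace that follows shows the master replaying its persisted recovery state: FileSystemPersistenceEngine.read is deserializing WorkerInfo files from the recovery directory. Since 2.0.2 ran fine before the upgrade, those files were most likely written by the old build, and the new build's classes no longer match them. If that's the case, stopping the master and clearing the recovery directory should let it start clean. A minimal cleanup sketch, assuming FILESYSTEM recovery mode and a hypothetical spark.deploy.recoveryDirectory path:

    import java.io.File

    // Minimal sketch: delete stale master-recovery files serialized by an
    // older Spark build. "/tmp/spark-recovery" is a placeholder for whatever
    // spark.deploy.recoveryDirectory points to in your deployment.
    object ClearStaleRecoveryState {
      def main(args: Array[String]): Unit = {
        val recoveryDir = new File("/tmp/spark-recovery") // hypothetical path
        val files = Option(recoveryDir.listFiles()).getOrElse(Array.empty[File])
        files.foreach { f =>
          println(s"removing stale state file: ${f.getAbsolutePath}")
          f.delete()
        }
      }
    }
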
> 18/02/01 05:07:22 ERROR Utils: Exception encountered
> java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local class serialVersionUID = 1835832137613908542
>         at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:687)
>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>         at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:563)
>         at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply$mcV$sp(WorkerInfo.scala:52)
>         at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply(WorkerInfo.scala:51)
>         at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply(WorkerInfo.scala:51)
>         at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1303)
>         at org.apache.spark.deploy.master.WorkerInfo.readObject(WorkerInfo.scala:51)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:433)
>         at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
>         at org.apache.spark.deploy.master.FileSystemPersistenceEngine.org$apache$spark$deploy$master$FileSystemPersistenceEngine$$deserializeFromFile(FileSystemPersistenceEngine.scala:80)
>         at org.apache.spark.deploy.master.FileSystemPersistenceEngine$$anonfun$read$1.apply(FileSystemPersistenceEngine.scala:56)
>         at org.apache.spark.deploy.master.FileSystemPersistenceEngine$$anonfun$read$1.apply(FileSystemPersistenceEngine.scala:56)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>         at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
>         at org.apache.spark.deploy.master.FileSystemPersistenceEngine.read(FileSystemPersistenceEngine.scala:56)
>         at org.apache.spark.deploy.master.PersistenceEngine$$anonfun$readPersistedData$1.apply(PersistenceEngine.scala:87)
>         at org.apache.spark.deploy.master.PersistenceEngine$$anonfun$readPersistedData$1.apply(PersistenceEngine.scala:86)
>         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>         at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:316)
>         at org.apache.spark.deploy.master.PersistenceEngine.readPersistedData(PersistenceEngine.scala:86)
>
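More generally, the usual guard against this failure mode is declaring the UID explicitly rather than letting the JVM derive it, so serialized forms stay stable across recompilations. A generic Scala illustration, not Spark's actual code:

    import scala.annotation.SerialVersionUID

    // Generic illustration: an explicit, stable serialVersionUID lets old
    // serialized instances deserialize after the class is recompiled, as long
    // as the changes stay serialization-compatible.
    @SerialVersionUID(1L)
    class EndpointRef(val name: String) extends Serializable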
>
>
> Regards,
>
> *Mihai Iacob*
> DSX Local <https://datascience.ibm.com/local> - Security, IBM Analytics
>
