[ https://issues.apache.org/jira/browse/SPARK-25282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599310#comment-16599310 ]
Yinan Li commented on SPARK-25282:
----------------------------------

I'm not sure this is a bug, or how it could be enforced systematically. When you use client mode and run the driver on a host outside the cluster, you are using the Spark distribution on that host, which may or may not be the same version as the Spark jars in the image. I suspect this is not even a problem unique to Spark on Kubernetes.

> Fix support for spark-shell with K8s
> ------------------------------------
>
>                 Key: SPARK-25282
>                 URL: https://issues.apache.org/jira/browse/SPARK-25282
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Prashant Sharma
>            Priority: Major
>
> Spark shell, when run with a Kubernetes master, gives the following error:
> {noformat}
> java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = -3720498261147521051, local class serialVersionUID = -6655865447853211720
> 	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> {noformat}
> Special care was taken to ensure that the same compiled jar was used both in the images and on the host system running the driver.
> This issue affects the PySpark and R interfaces as well.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
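One way to confirm the version-skew hypothesis in the comment above is to compare the serialVersionUID that each side's classpath resolves for the failing class. The sketch below is a minimal standalone check, not part of Spark: it prints the serialVersionUID the local JVM computes for a given Serializable class. The class to check here would be org.apache.spark.storage.BlockManagerId (run once on the driver host and once inside the executor image, each with the respective Spark jars on the classpath); java.util.ArrayList is used as the default so the snippet runs without Spark.

```java
import java.io.ObjectStreamClass;

// Prints the serialVersionUID that the local classpath resolves for a
// Serializable class. If the driver host and the executor image print
// different values for the same class, their Spark versions differ,
// which produces exactly the InvalidClassException in the stack trace.
public class SuidCheck {
    public static void main(String[] args) throws Exception {
        // Default to a JDK class so this compiles and runs without Spark;
        // pass "org.apache.spark.storage.BlockManagerId" to check Spark.
        String className = args.length > 0 ? args[0] : "java.util.ArrayList";
        Class<?> cls = Class.forName(className);
        ObjectStreamClass osc = ObjectStreamClass.lookup(cls);
        if (osc == null) {
            System.out.println(className + " is not Serializable");
        } else {
            System.out.println(className + " serialVersionUID = "
                    + osc.getSerialVersionUID());
        }
    }
}
```

Matching UIDs on both sides rule out this particular skew; a mismatch pinpoints which jar differs without digging through the full distributions.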