Hi,
I'm getting the same error while manually setting up a Spark cluster.
Has there been any update about this error?
Rgds
Niranda
We experienced a similar issue in our environment; the whole stack trace is
below. It works fine if we run in local mode, but if we run in cluster mode
(even with the Master and one worker on the same node), we hit this
serialVersionUID issue. We use Spark 1.0.0, compiled with JDK 6.
Here is a link about
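(Background sketch, not from the original message: Java derives a default
serialVersionUID from the class's structure and compiled form, which is why
the same source built with different toolchains can disagree. For classes
you control, Scala lets you pin the UID; this does not help for Spark's own
classes such as SerializableWritable, where the fix is to run identical
builds everywhere. A minimal, hypothetical example:)

    import java.io.ObjectStreamClass

    // Pinning the UID keeps it stable across recompilation, as long as
    // the fields remain compatible.
    @SerialVersionUID(1L)
    case class Point(x: Double, y: Double)

    object UidDemo {
      def main(args: Array[String]): Unit = {
        // Prints 1, the pinned value, regardless of which JDK compiled it.
        println(ObjectStreamClass.lookup(classOf[Point]).getSerialVersionUID)
      }
    }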
I tried building with Java 6 and also tried the pre-built packages. I am
still getting the same error.
It works fine when I run it on a machine with Solaris OS and x86
architecture, but it does not work with Solaris OS and SPARC architecture.
Any ideas why this would happen?
Thanks,
Su
I am building Spark myself, and I am using Java 7 to both build and run.
I will try with Java 6.
Thanks,
Suman.
On 6/3/2014 7:18 PM, Matei Zaharia wrote:
What Java version do you have, and how did you get Spark (did you build it
yourself by any chance or download a pre-built one)? If you build Spark
yourself you need to do it with Java 6 — it’s a known issue because of the way
Java 6 and 7 package JAR files. But I haven’t seen it result in this p
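(A quick diagnostic, offered as a sketch rather than anything from this
thread: you can ask ObjectStreamClass which UID your local class files
actually produce, and compare it against the two values in the exception.
Run it on the driver node and on each worker; it assumes the Spark JARs are
on the classpath.)

    import java.io.ObjectStreamClass

    object UidCheck {
      def main(args: Array[String]): Unit = {
        // The JVM that will serialize/deserialize on this node.
        println("java.version = " + System.getProperty("java.version"))

        // The UID computed from the local class files; if it differs
        // between nodes, the deployed builds differ.
        val cls = Class.forName("org.apache.spark.SerializableWritable")
        println("local serialVersionUID = " +
          ObjectStreamClass.lookup(cls).getSerialVersionUID)
      }
    }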
Hi all,
I get the following exception when using Spark to run the example k-means
program. I am using Spark 1.0.0 and running the program locally.
java.io.InvalidClassException: scala.Tuple2; invalid descriptor for field _1
at java.io.ObjectStreamClass.readNonProxy(ObjectStreamClass.jav
On 5/27/2014 1:28 PM, Marcelo Vanzin wrote:
On Tue, May 27, 2014 at 1:05 PM, Suman Somasundar
wrote:
> I am running this on a Solaris machine with logical partitions. All the
> partitions (workers) access the same Spark folder.
Can you check whether you have multiple versions of the offending
class (org.apache.spark.SerializableWritable) i
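(To check that, one option is a small classpath probe; a sketch of mine,
assuming you launch it with the same classpath the worker uses:)

    import scala.collection.JavaConverters._

    object DupCheck {
      def main(args: Array[String]): Unit = {
        // Every location the class loader can resolve this class from;
        // more than one URL means multiple versions are visible.
        val urls = getClass.getClassLoader
          .getResources("org/apache/spark/SerializableWritable.class")
          .asScala.toList
        urls.foreach(println)
        if (urls.size > 1) println(s"WARNING: ${urls.size} copies found")
      }
    }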
I am running this on a Solaris machine with logical partitions. All the
partitions (workers) access the same Spark folder.
Thanks,
Suman.
On 5/23/2014 9:44 PM, Andrew Or wrote:
That means not all of your driver and executors have the same version of
Spark. Are you on a standalone EC2 cluster? If so, one way to fix this is
to run the following on the master node:
/root/spark-ec2/copy-dir --delete /root/spark
This syncs all of Spark across your cluster, configs, jars and
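(If you are not on EC2 and cannot use copy-dir, here is a sketch of a check
you can run from inside a job: ask each executor where it loaded the class
from and which JVM it runs. The class name and parallelism are illustrative;
if the job itself dies with the same InvalidClassException, that already
confirms the mismatch. getCodeSource can be null for bootstrap classes, but
for classes shipped in Spark's JAR it should be populated.)

    import org.apache.spark.{SparkConf, SparkContext}

    object ClusterCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("cluster-check"))
        // One report per task: the JAR SerializableWritable came from
        // and the JVM version of the executor that ran the task.
        val reports = sc.parallelize(1 to 16, 16).map { _ =>
          val cls = Class.forName("org.apache.spark.SerializableWritable")
          val jar = cls.getProtectionDomain.getCodeSource.getLocation
          "jar=" + jar + " jvm=" + System.getProperty("java.version")
        }.distinct().collect()
        reports.foreach(println)
        sc.stop()
      }
    }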
Hi,
I get the following exception when using Spark to run various programs.
java.io.InvalidClassException: org.apache.spark.SerializableWritable;
local class incompatible: stream classdesc serialVersionUID =
6301214776158303468, local class serialVersionUID = -7785455416944904980
at j