I see. The default Scala version changed to 2.11 with Spark 2.0.0, as far as I
know, so that's probably the version you get when downloading the prepackaged
binaries. Glad I could help ;)
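
If you want to double-check which Scala version each side is actually running,
here is a minimal, untested sketch (my illustration; the object and app names
are made up) that prints the Spark and Scala versions seen by the driver and
the Scala version reported back by the executors:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative helper: print the Scala/Spark versions on the driver and
    // the Scala version reported back by the executors.
    object VersionCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("version-check"))
        // Driver side: whatever your build file resolved (e.g. spark-core_2.11).
        println(s"driver:    spark=${sc.version} scala=${scala.util.Properties.versionString}")
        // Executor side: whatever the installed distribution ships with.
        val executorScala = sc.parallelize(1 to 2, 2)
          .map(_ => scala.util.Properties.versionString)
          .collect().distinct.mkString(", ")
        println(s"executors: scala=$executorScala")
        sc.stop()
      }
    }

In the failure below the app never manages to register with the master, so only
the driver-side line would print, but that alone already tells you which
_2.10/_2.11 artifacts your build pulled in.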

On 3 September 2016 23:59:51 CEST, kant kodali <kanth...@gmail.com> wrote:
>@Fridtjof you are right!
>changing it to this Fixed it!
>compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.0'
>compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
>
>On Sat, Sep 3, 2016 12:30 PM, kant kodali <kanth...@gmail.com> wrote:
>I increased the memory but nothing has changed; I still get the same error.
>@Fridtjof on the driver side I am using the following dependencies:
>compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
>compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '2.0.0'
>On the executor side I don't know what jars are being used, but I installed
>Spark using this archive: spark-2.0.0-bin-hadoop2.7.tgz
>
>On Sat, Sep 3, 2016 4:20 AM, Fridtjof Sander <fridtjof.san...@googlemail.com> wrote:
>There is an InvalidClassException complaining about non-matching
>serialVersionUIDs. Shouldn't that be caused by different jars on
>executors and
>driver?
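
For anyone who finds this later, here is a minimal sketch (my illustration, not
part of the original mail; the object name is made up) of how to test that
theory: Java serialization stamps each class with a serialVersionUID, and
deserialization fails with exactly this InvalidClassException when the sender's
and receiver's UIDs differ. You can print the UID a given classpath carries for
the class named in the stack trace:

    import java.io.ObjectStreamClass

    // Illustrative diagnostic: print the serialVersionUID computed from the
    // spark jars on this JVM's classpath.
    object UidCheck {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("org.apache.spark.rpc.netty.RequestMessage")
        // lookup returns null for classes that are not Serializable.
        Option(ObjectStreamClass.lookup(cls)) match {
          case Some(desc) => println(s"${cls.getName} serialVersionUID = ${desc.getSerialVersionUID}")
          case None       => println(s"${cls.getName} is not Serializable on this classpath")
        }
      }
    }

Running it once against the driver's dependencies and once against the jars
under the workers' Spark installation should print two different numbers when
the builds don't match, which is what the trace below reports.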
>
>On 3 September 2016 1:04 PM, "Tal Grynbaum" <tal.grynb...@gmail.com> wrote:
>My guess is that you're running out of memory somewhere. Try to increase the
>driver memory and/or executor memory.
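
To make that concrete, a minimal sketch (my illustration; the "4g" value and
names are placeholders): spark.executor.memory can be raised from the
application's SparkConf, while driver memory in client mode is normally passed
on the spark-submit command line via --driver-memory, because the driver JVM is
already running by the time this code executes:

    import org.apache.spark.{SparkConf, SparkContext}

    object MemoryExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("memory-example")        // placeholder app name
          .set("spark.executor.memory", "4g")  // placeholder value; tune to your workload
        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }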
>
>On Sat, Sep 3, 2016, 11:42 kant kodali <kanth...@gmail.com> wrote:
>I am running this on AWS.
>
>On Fri, Sep 2, 2016 11:49 PM, kant kodali <kanth...@gmail.com> wrote:
>I am running Spark in standalone mode. I get this error when I run my driver
>program. I am using Spark 2.0.0. Any idea what this error could be?
>
>Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>16/09/02 23:44:44 INFO SparkContext: Running Spark version 2.0.0
>16/09/02 23:44:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>16/09/02 23:44:45 INFO SecurityManager: Changing view acls to: kantkodali
>16/09/02 23:44:45 INFO SecurityManager: Changing modify acls to: kantkodali
>16/09/02 23:44:45 INFO SecurityManager: Changing view acls groups to:
>16/09/02 23:44:45 INFO SecurityManager: Changing modify acls groups to:
>16/09/02 23:44:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(kantkodali); groups with view permissions: Set(); users  with modify permissions: Set(kantkodali); groups with modify permissions: Set()
>16/09/02 23:44:45 INFO Utils: Successfully started service 'sparkDriver' on port 62256.
>16/09/02 23:44:45 INFO SparkEnv: Registering MapOutputTracker
>16/09/02 23:44:45 INFO SparkEnv: Registering BlockManagerMaster
>16/09/02 23:44:45 INFO DiskBlockManager: Created local directory at /private/var/folders/_6/lfxt933j3bd_xhq0m7dwm8s00000gn/T/blockmgr-b56eea49-0102-4570-865a-1d3d230f0ffc
>16/09/02 23:44:45 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
>16/09/02 23:44:45 INFO SparkEnv: Registering OutputCommitCoordinator
>16/09/02 23:44:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
>16/09/02 23:44:45 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.191:4040
>16/09/02 23:44:45 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://52.43.37.223:7077...
>16/09/02 23:44:46 INFO TransportClientFactory: Successfully created connection to /52.43.37.223:7077 after 70 ms (0 ms spent in bootstraps)
>16/09/02 23:44:46 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 52.43.37.223:7077
>org.apache.spark.SparkException: Exception thrown in awaitResult
>    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
>    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
>    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
>    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
>    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
>    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
>    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>    at java.lang.Thread.run(Thread.java:745)
>Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695
>    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
>    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:
