There is an InvalidClassException complaining about non-matching serialVersionUIDs. Wouldn't that be caused by different jars on the executors and the driver?
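A quick way to check whether the jars really differ is to compare the Spark version baked into the driver's classpath with the version the standalone master reports on its web UI (port 8080 by default). A minimal sketch, assuming a Scala driver; the object name VersionCheck is just illustrative:

  object VersionCheck {
    def main(args: Array[String]): Unit = {
      // SPARK_VERSION is the version string compiled into the spark-core jar
      // the driver picked up; compare it with what the master's web UI shows.
      println(s"Driver-side Spark version: ${org.apache.spark.SPARK_VERSION}")
    }
  }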
On 03.09.2016 at 1:04 PM, "Tal Grynbaum" <tal.grynb...@gmail.com> wrote:
> My guess is that you're running out of memory somewhere. Try to increase
> the driver memory and/or executor memory.
>
> On Sat, Sep 3, 2016, 11:42 kant kodali <kanth...@gmail.com> wrote:
>
>> I am running this on aws.
>>
>> On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth...@gmail.com wrote:
>>
>>> I am running Spark in standalone mode. I get this error when I run my
>>> driver program. I am using Spark 2.0.0. Any idea what this error could be?
>>>
>>> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>>> 16/09/02 23:44:44 INFO SparkContext: Running Spark version 2.0.0
>>> 16/09/02 23:44:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 16/09/02 23:44:45 INFO SecurityManager: Changing view acls to: kantkodali
>>> 16/09/02 23:44:45 INFO SecurityManager: Changing modify acls to: kantkodali
>>> 16/09/02 23:44:45 INFO SecurityManager: Changing view acls groups to:
>>> 16/09/02 23:44:45 INFO SecurityManager: Changing modify acls groups to:
>>> 16/09/02 23:44:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(kantkodali); groups with view permissions: Set(); users with modify permissions: Set(kantkodali); groups with modify permissions: Set()
>>> 16/09/02 23:44:45 INFO Utils: Successfully started service 'sparkDriver' on port 62256.
>>> 16/09/02 23:44:45 INFO SparkEnv: Registering MapOutputTracker
>>> 16/09/02 23:44:45 INFO SparkEnv: Registering BlockManagerMaster
>>> 16/09/02 23:44:45 INFO DiskBlockManager: Created local directory at /private/var/folders/_6/lfxt933j3bd_xhq0m7dwm8s00000gn/T/blockmgr-b56eea49-0102-4570-865a-1d3d230f0ffc
>>> 16/09/02 23:44:45 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
>>> 16/09/02 23:44:45 INFO SparkEnv: Registering OutputCommitCoordinator
>>> 16/09/02 23:44:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
>>> 16/09/02 23:44:45 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.191:4040
>>> 16/09/02 23:44:45 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://52.43.37.223:7077...
>>> 16/09/02 23:44:46 INFO TransportClientFactory: Successfully created connection to /52.43.37.223:7077 after 70 ms (0 ms spent in bootstraps)
>>> 16/09/02 23:44:46 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 52.43.37.223:7077
>>> org.apache.spark.SparkException: Exception thrown in awaitResult
>>>   at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>>>   at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>>>   at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
>>>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>>>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>>>   at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
>>>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
>>>   at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
>>>   at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
>>>   at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
>>>   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>   at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695
>>>   at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
>>>   at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:
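For what it's worth, the serialVersionUID mismatch on org.apache.spark.rpc.netty.RequestMessage in the trace above happens while registering with the master, so the usual suspect is a client build whose Spark jars don't match the release the standalone cluster is running. A hedged build.sbt sketch, assuming an sbt build; the version string has to be whatever release the master actually runs (the log only shows 2.0.0 on the driver side):

  // Keep the client's Spark dependency in lock-step with the cluster;
  // "provided" avoids bundling a second copy of Spark into the application jar.
  libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"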