Re: ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-29 Thread Prem Spark
You need to make sure this class is accessible on all servers: since the job runs in cluster mode, the driver can end up on any of the worker nodes.
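For illustration, a minimal sketch of what "accessible to all servers" can look like on the driver side, using the jar path from the original post below. The spark.jars property (or an explicit SparkContext.addJar call) lists the application jar, but the path still has to resolve on whichever worker ends up hosting the driver (local copy, shared filesystem, hdfs:// or http:// URL); this is a sketch, not the poster's actual code:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("Benchmark")
      // In cluster mode the driver may be launched on any worker, so this path
      // must also exist there (shared filesystem, hdfs:// or http:// URL).
      .set("spark.jars",
           "/home/user/bench/target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar")

    val sc = new SparkContext(conf)
    // Jars can also be added after the context has been created:
    sc.addJar("/home/user/bench/target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar")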

Re: ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-29 Thread Saiph Kappa
I found out that the exception no longer occurs if I comment out this line in the application code:

    sparkConf.set("spark.executor.extraJavaOptions",
      "-XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+AggressiveOpts -XX:FreqInlineSize=300 -XX:MaxInlineSize=300")

Not entirely sure why, but
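For context, a hedged sketch of the driver-side configuration this workaround amounts to, assuming a SparkConf named sparkConf as in the quoted snippet; the JVM flags can alternatively be supplied outside the application code, e.g. in conf/spark-defaults.conf or via --conf on spark-submit:

    import org.apache.spark.SparkConf

    val sparkConf = new SparkConf().setAppName("Benchmark")

    // The line the workaround comments out. If kept, the value should be a plain
    // space-separated list of JVM flags; the same flags can equally be set outside
    // the application, e.g.:
    //   --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC ..."
    // or in conf/spark-defaults.conf.
    // sparkConf.set("spark.executor.extraJavaOptions",
    //   "-XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+AggressiveOpts " +
    //   "-XX:FreqInlineSize=300 -XX:MaxInlineSize=300")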

ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-25 Thread Saiph Kappa
Hi, I'm submitting a spark job like this:

    ~/spark-1.5.2-bin-hadoop2.6/bin/spark-submit --class Benchmark \
      --master spark://machine1:6066 --deploy-mode cluster \
      --jars target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar \
      /home/user/bench/target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar
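For reference, a minimal sketch of a Benchmark object matching the --class Benchmark argument above; the workload shown here is a hypothetical placeholder, since the actual application code is not included in the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    object Benchmark {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("Benchmark")
        val sc = new SparkContext(sparkConf)

        // Placeholder workload; the real benchmark logic is not shown in the thread.
        val n = sc.parallelize(1 to 1000000).map(_ * 2).count()
        println(s"count = $n")

        sc.stop()
      }
    }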