Hi
The compiled version (master side) and the client version diverge on Spark's network
JavaUtils class. You should use the same/aligned version on both sides.
Regards,
JB
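JB's diagnosis can be checked empirically. Below is a minimal reflection probe (a hypothetical helper, not part of the original app) that, when run with the same classpath as the failing container, reports whether the method named in the NoSuchMethodError is actually resolvable there:

```java
// Hypothetical diagnostic, not part of the original app: checks whether a
// given public method is resolvable on the current classpath. Running this
// in the same environment that throws NoSuchMethodError tells you which
// Spark version's JavaUtils actually won on the classpath.
public class MethodProbe {

    /** True if className exposes a public method with this name and parameter types. */
    static boolean hasMethod(String className, String method, Class<?>... params) {
        try {
            Class.forName(className).getMethod(method, params);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // JavaUtils.timeStringAsSec(String) is the method the stack trace names.
        System.out.println(hasMethod(
            "org.apache.spark.network.util.JavaUtils", "timeStringAsSec", String.class));
    }
}
```

If this prints false inside the container, the container resolved the older (1.3) JavaUtils, whatever the fat jar ships.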


Sent from my Samsung device

-------- Original message --------
From: Raghuveer Chanda <raghuveer.cha...@gmail.com> 
Date: 21/10/2015  12:33  (GMT+01:00) 
To: user@spark.apache.org 
Subject: Spark on Yarn 

Hi all,
I am trying to run Spark on YARN in the Cloudera QuickStart VM. It already has Spark
1.3 and Hadoop 2.6.0-cdh5.4.0 installed. (I am not using spark-submit, since I
want to run a different version of Spark.) I am able to run Spark 1.3 on YARN
but get the error below for Spark 1.4.
The log shows it is running Spark 1.4, but it still throws an error on a method
that is present in 1.4 and not in 1.3. Even the fat jar contains the class files
of 1.4.
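Even when the fat jar contains the 1.4 classes, the JVM may have loaded the class from a different jar earlier on the classpath. A small sketch (hypothetical helper, not part of the app) that reports which jar or directory actually supplied a class at runtime:

```java
// Hypothetical diagnostic, not part of the original app: prints the code
// source (jar or directory) that a class was actually loaded from. This is
// the first thing to check when a NoSuchMethodError suggests the classpath
// resolved an older copy of the class than the one shipped in the fat jar.
public class WhichJar {

    /** Location of the named class's code source, or a marker string. */
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "bootstrap/JDK class" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // e.g. java WhichJar org.apache.spark.network.util.JavaUtils
        String name = args.length > 0 ? args[0] : "org.apache.spark.network.util.JavaUtils";
        System.out.println(name + " -> " + locate(name));
    }
}
```

Run inside the YARN container, this would show whether JavaUtils came from the application's fat jar or from the cluster's installed Spark 1.3 jars.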

As far as running on YARN goes, the installed Spark version shouldn't matter, but
it still seems to be running the other version.

Hadoop Version:
Hadoop 2.6.0-cdh5.4.0
Subversion http://github.com/cloudera/hadoop -r c788a14a5de9ecd968d1e2666e8765c5f018c271
Compiled by jenkins on 2015-04-21T19:18Z
Compiled with protoc 2.5.0
From source with checksum cd78f139c66c13ab5cee96e15a629025
This command was run using /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.0.jar
Error:
LogType:stderr
Log Upload Time:Tue Oct 20 21:58:56 -0700 2015
LogLength:2334
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/10/simple-yarn-app-1.1.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/10/20 21:58:50 INFO spark.SparkContext: Running Spark version 1.4.0
15/10/20 21:58:53 INFO spark.SecurityManager: Changing view acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: Changing modify acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn); users with modify permissions: Set(yarn)
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.timeStringAsSec(Ljava/lang/String;)J
    at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
    at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:68)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at com.hortonworks.simpleyarnapp.HelloWorld.main(HelloWorld.java:50)
15/10/20 21:58:53 INFO util.Utils: Shutdown hook called
Please help :)

--
Regards and Thanks,
Raghuveer Chanda

