[ https://issues.apache.org/jira/browse/SPARK-9441?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14646578#comment-14646578 ]
Sean Owen commented on SPARK-9441:
----------------------------------

Spark uses Akka, which uses Typesafe Config 1.2.1; that version contains this method and matches what you're using. Before modifying your project, which you shouldn't need to do, use {{mvn dependency:tree}} to verify which version of Typesafe Config you're actually getting, and from where. My guess is that you're pulling in an older version from another dependency, and that's what is conflicting. There's a longer story here about Akka, Typesafe Config, and shading, but if I'm right about that, then you just need to manage the version in your app in {{<dependencyManagement>}} rather than write exclusions. I think this is less than ideal, but it is still 'by design' while Akka is around, and it should be a reliable workaround if that's your app's situation.

> NoSuchMethodError: Com.typesafe.config.Config.getDuration
> ---------------------------------------------------------
>
>                 Key: SPARK-9441
>                 URL: https://issues.apache.org/jira/browse/SPARK-9441
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1
>            Reporter: nirav patel
>
> I recently migrated my Spark-based REST service from 1.0.2 to 1.3.1:
>
> 15/07/29 10:31:12 INFO spark.SparkContext: Running Spark version 1.3.1
> 15/07/29 10:31:12 INFO spark.SecurityManager: Changing view acls to: npatel
> 15/07/29 10:31:12 INFO spark.SecurityManager: Changing modify acls to: npatel
> 15/07/29 10:31:12 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(npatel); users with modify permissions: Set(npatel)
> Exception in thread "main" java.lang.NoSuchMethodError: com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
> 	at akka.util.Helpers$ConfigOps$.akka$util$Helpers$ConfigOps$$getDuration$extension(Helpers.scala:125)
> 	at akka.util.Helpers$ConfigOps$.getMillisDuration$extension(Helpers.scala:120)
> 	at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:171)
> 	at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
> 	at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
> 	at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
> 	at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
> 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
> 	at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
> 	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
> 	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
> 	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
>
> I have read blog posts where people suggest modifying the classpath to put the right version first, putting the Scala libs earlier on the classpath, and similar suggestions, which all seems ridiculous. I think the typesafe config package included with the spark-core lib is incorrect. I did the following with my Maven build and it now works, but I think someone needs to fix the spark-core package:
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-core_2.10</artifactId>
>   <exclusions>
>     <exclusion>
>       <artifactId>config</artifactId>
>       <groupId>com.typesafe</groupId>
>     </exclusion>
>   </exclusions>
> </dependency>
> <dependency>
>   <groupId>com.typesafe</groupId>
>   <artifactId>config</artifactId>
>   <version>1.2.1</version>
> </dependency>

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
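The {{<dependencyManagement>}} approach Sean Owen recommends, as an alternative to the reporter's exclusions, can be sketched as the following pom.xml fragment. This is a minimal illustration, not taken from the thread: it assumes a standard Maven project and simply pins {{com.typesafe:config}} to 1.2.1, so that whichever transitive path pulls the artifact in, Maven's resolution chooses the version Akka expects.

{code:xml}
<!-- Hypothetical pom.xml fragment: pin the Typesafe Config version
     application-wide instead of adding per-dependency exclusions.
     dependencyManagement only fixes the version chosen during conflict
     resolution; it does not add the dependency itself. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.typesafe</groupId>
      <artifactId>config</artifactId>
      <version>1.2.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}

To see which version and dependency path were being resolved before the pin, {{mvn dependency:tree -Dincludes=com.typesafe:config}} restricts the printed tree to the matching artifact and the chains that pull it in.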