Thanks Ted. I haven't explicitly specified a Scala version anywhere (I tried setting different versions in pom.xml as well).
For what it is worth, this is what I get when I run a Maven dependency tree. I wonder if the 2.11.2 coming in via scala-reflect matters:

[INFO] |  |  \- org.scala-lang:scalap:jar:2.11.0:compile
[INFO] |  |     \- org.scala-lang:scala-compiler:jar:2.11.0:compile
[INFO] |  |        +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.1:compile
[INFO] |  |        \- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.1:compile
[INFO] |  +- com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.4.4:compile
[INFO] |  |  +- org.scala-lang:scala-reflect:jar:2.11.2:compile
[INFO] \- org.scala-lang:scala-library:jar:2.11.0:compile

On Fri, Mar 11, 2016 at 10:38 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> Looks like a Scala version mismatch.
>
> Are you using 2.11 everywhere?
>
> On Fri, Mar 11, 2016 at 10:33 AM, vasu20 <vas...@gmail.com> wrote:
>
>> Hi,
>>
>> Any help on this is appreciated. I am trying to write a Spark program
>> using IntelliJ. I get a runtime error as soon as new SparkConf() is
>> called from main. The top few lines of the exception are pasted below.
>>
>> These are the versions I am using:
>>
>> Spark jar: spark-assembly-1.6.0-hadoop2.6.0.jar
>> pom:  <artifactId>spark-core_2.11</artifactId>
>>       <version>1.6.0</version>
>>
>> I have installed the Scala plugin in IntelliJ and added a dependency.
>>
>> I have also added a library dependency in the project structure.
>>
>> Thanks for any help!
>>
>> Vasu
>>
>>
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
>>     at org.apache.spark.util.Utils$.<init>(Utils.scala:1682)
>>     at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
>>     at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Newbie-question-Help-with-runtime-error-on-augmentString-tp26462.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
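[A sketch of one common fix for trees like the one above, not from the original thread: pinning every org.scala-lang artifact to a single version in <dependencyManagement> so that transitive dependencies such as the scala-reflect pulled in by jackson-module-scala cannot drift to a different 2.11.x release. The 2.11.7 version shown is illustrative; any single consistent 2.11.x version should behave the same.]

```xml
<!-- Sketch only: forces one Scala version for all org.scala-lang artifacts.
     dependencyManagement entries override the versions that transitive
     dependencies would otherwise resolve to. 2.11.7 is an assumed example
     version, not one taken from the thread. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.7</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-reflect</artifactId>
      <version>2.11.7</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-compiler</artifactId>
      <version>2.11.7</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

[After adding this, the result can be checked with: mvn dependency:tree -Dincludes=org.scala-lang — every Scala artifact in the output should then show the same version.]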