Looks like a Scala version mismatch.

Are you using 2.11 everywhere? The prebuilt spark-assembly-1.6.0-hadoop2.6.0.jar is, as far as I recall, built against Scala 2.10, while your pom pulls in spark-core_2.11; mixing the two would produce exactly this kind of NoSuchMethodError on scala.Predef.
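One way to align them, sketched below and assuming you want to keep the stock prebuilt assembly jar (which, if I remember right, is a Scala 2.10 build), is to switch your pom to the _2.10 artifact:

```xml
<!-- Hypothetical pom fragment: the Scala suffix of every Spark artifact
     must match the Scala version Spark itself was compiled with.
     Assuming the prebuilt spark-assembly-1.6.0-hadoop2.6.0.jar is a
     Scala 2.10 build, depend on the _2.10 artifacts: -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.0</version>
</dependency>
```

The alternative is to keep spark-core_2.11 and use a Spark assembly built for Scala 2.11 instead (Spark 1.6 ships a dev/change-scala-version.sh script for producing such a build from source). Either way, every Spark artifact and the Scala library on your classpath need the same Scala binary version.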

On Fri, Mar 11, 2016 at 10:33 AM, vasu20 <vas...@gmail.com> wrote:

> Hi
>
> Any help appreciated on this.  I am trying to write a Spark program using
> IntelliJ.  I get a runtime error as soon as new SparkConf() is called from
> main.  The top few lines of the exception are pasted below.
>
> These are the versions I am using:
>
> Spark jar:  spark-assembly-1.6.0-hadoop2.6.0.jar
> pom:  <artifactId>spark-core_2.11</artifactId>
>          <version>1.6.0</version>
>
> I have installed the Scala plugin in IntelliJ and added a dependency.
>
> I have also added a library dependency in the project structure.
>
> Thanks for any help!
>
> Vasu
>
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
>         at org.apache.spark.util.Utils$.<init>(Utils.scala:1682)
>         at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
>         at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Newbie-question-Help-with-runtime-error-on-augmentString-tp26462.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
