No, Spark is cross-built for 2.11 too, and those are the deps being
pulled in here. This really does, however, sound like a Scala 2.10 vs.
2.11 mismatch: the NoSuchMethodError below is the classic symptom of
code compiled against the 2.11 standard library running on a 2.10
runtime. Check, for example, that your cluster is using the same build
of Spark, and that you did not package Spark with your app.
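For instance, assuming the cluster runs a stock Spark 1.2.1 build
(compiled against Scala 2.10), a simple.sbt along these lines should
keep everything consistent. This is just a sketch of your file with the
versions aligned; using %% instead of a hard-coded _2.11 suffix makes
sbt derive the artifact suffix from scalaVersion, so the two cannot
drift apart:

name := "SparkEpiFast"

version := "1.0"

// Must match the Scala version your cluster's Spark build uses
scalaVersion := "2.10.4"

// %% appends the Scala binary suffix (_2.10 here) automatically,
// so the Spark artifacts always track scalaVersion above
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1" % "provided"

A quick way to verify what the cluster actually has is to start
spark-shell there; its startup banner reports the Scala version the
build was compiled with.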

On Thu, Mar 19, 2015 at 3:36 PM, Masf <masfwo...@gmail.com> wrote:
> Hi
>
> Spark 1.2.1 uses Scala 2.10. Because of this, your program fails with Scala 2.11.
>
> Regards
>
> On Thu, Mar 19, 2015 at 8:17 PM, Vijayasarathy Kannan <kvi...@vt.edu> wrote:
>>
>> My current simple.sbt is
>>
>> name := "SparkEpiFast"
>>
>> version := "1.0"
>>
>> scalaVersion := "2.11.4"
>>
>> libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.1" %
>> "provided"
>>
>> libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "1.2.1"
>> % "provided"
>>
>> While I do "sbt package", it compiles successfully. But while running the
>> application, I get
>> "Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;"
>>
>> However, changing the Scala version to 2.10.4 and updating the dependency
>> lines accordingly resolves the issue (no exception).
>>
>> Could anyone please point out what I am doing wrong?
>
> --
> Regards,
> Miguel Ángel

