Hi,

It looks like you're packaging your application for Scala 2.13 (set via
scalaVersion in your build.sbt) while your Spark installation is built
for Scala 2.12.
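
You can confirm this on the installation side, since spark-submit
reports the Scala version it was built with (output abbreviated; the
exact numbers below are only an example):

$ /opt/spark/bin/spark-submit --version
...
Using Scala version 2.12.15, OpenJDK 64-Bit Server VM, 11.0.13
...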

Go to https://spark.apache.org/downloads.html and, under "Choose a
package type", pick the package that says "Scala 2.13". With that
release you should be able to run your application.
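
Alternatively, if you'd rather keep your current installation, pin your
build to Scala 2.12 instead. A minimal build.sbt sketch for that (the
version numbers here are assumptions, so match them to your setup):

name := "my-job"
version := "1.0"
// Must match the Scala version your Spark distribution was built with.
scalaVersion := "2.12.15"
// "provided": the Spark jars already ship with your installation.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0" % "provided"

The %% operator appends the Scala binary suffix (_2.12 or _2.13) to the
artifact name, which is also why sbt placed your jar under
target/scala-2.13.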

In general, Scala minor versions (e.g. 2.12 and 2.13) are not binary
compatible. That is exactly what the NoSuchMethodError in your trace
says: code compiled for 2.13 expects ScalaRunTime.wrapIntArray to
return a scala.collection.immutable.ArraySeq, while the 2.12 runtime on
your classpath still returns a WrappedArray, so no method with that
signature exists.

Best
Hannes


On Sun, Feb 6, 2022 at 10:01 AM <capitnfrak...@free.fr> wrote:

> Hello
>
>   I wrote this simple job in Scala:
>
> $ cat Myjob.scala
> import org.apache.spark.sql.SparkSession
>
> object Myjob {
>    def main(args: Array[String]): Unit = {
>      val sparkSession = SparkSession.builder
>        .appName("Simple Application").getOrCreate()
>      val sparkContext = sparkSession.sparkContext
>
>      val arrayRDD = sparkContext.parallelize(List(1,2,3,4,5,6,7,8))
>      println(arrayRDD.getClass, arrayRDD.count())
>    }
> }
>
>
> After packaging it, I submit it to Spark and get this error:
>
> $ /opt/spark/bin/spark-submit --class "Myjob" --master local[4] target/scala-2.13/my-job_2.13-1.0.jar
>
> Exception in thread "main" java.lang.NoSuchMethodError: 'scala.collection.immutable.ArraySeq scala.runtime.ScalaRunTime$.wrapIntArray(int[])'
>         at Myjob$.main(Myjob.scala:8)
>         at Myjob.main(Myjob.scala)
>         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>         at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
>         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> What's the issue?
>
> Thank you.
>
