That did resolve my issue.
Thanks a lot.
frakass
On 06/02/2022 17:25, Hannes Bibel wrote:
Hi,
it looks like you're packaging your application for Scala 2.13 (this should be
specified in your build.sbt), while your Spark installation is built for
Scala 2.12.
Go to https://spark.apache.org/downloads.html and, under "Choose a
package type", select the package type that says "Scala 2.13". With that
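To illustrate the point about matching versions, here is a minimal build.sbt sketch for an application targeting a Spark distribution built for Scala 2.12 (the exact version numbers below are illustrative assumptions, not taken from the thread; use the ones matching your installation):

```scala
// build.sbt -- illustrative versions; the Scala version here must match
// the Scala build of the installed Spark distribution.
ThisBuild / scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // "provided": the Spark installation supplies these jars at runtime,
  // so they are not bundled into the application package.
  "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided"
)
```

The `%%` operator appends the Scala binary version suffix (here `_2.12`) to the artifact name, so the Spark artifacts resolved at build time match the Scala version your cluster runs.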
Hello,
I wrote this simple job in Scala:
$ cat Myjob.scala
import org.apache.spark.sql.SparkSession

object Myjob {
  def main(args: Array[String]): Unit = {
    val sparkSession = SparkSession.builder.appName("Simple Application").getOrCreate()
    val sparkContext =