Spark 3 supports only Scala 2.12. This actually sounds like a third-party
library that was compiled for Scala 2.11 or something along those lines.
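The symptom fits: scala.Product$class only exists in Scala 2.11 and earlier
(2.12 compiles trait bodies to default methods instead of a TraitName$class
helper), so a ClassNotFoundException for it usually means a _2.11 artifact
ended up on a 2.12 classpath. As a minimal sketch, assuming an sbt build
(artifact names below are placeholders, and the Spark version should match
whatever build you actually deploy), something like this keeps every
dependency on the 2.12 binary version:

    // build.sbt -- sketch only, names are placeholders
    scalaVersion := "2.12.11"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary suffix (_2.12) automatically,
      // so a hard-coded _2.11 artifact cannot sneak onto the classpath
      "org.apache.spark" %% "spark-sql" % "3.0.0" % "provided",
      "com.example"      %% "some-third-party-lib" % "1.0.0"
    )

If the build already looks like that, checking the jars shipped with the
application for any *_2.11 suffix is usually the quickest way to find the
offender.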

On Fri, Jun 5, 2020 at 11:11 PM charles_cai <1620075...@qq.com> wrote:

> Hi Pol,
>
> Thanks for your suggestion. I am going to use Spark 3.0.0 for GPU
> acceleration, so I updated Scala to *version 2.12.11* and the latest
> *2.13*, but the error is still there. By the way, the Spark version is
> *spark-3.0.0-preview2-bin-without-hadoop*
>
> Caused by: java.lang.ClassNotFoundException: scala.Product$class
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>
> Charles cai
>
>
>
>
