Hi Fabrizio,

Spark 3.2.0 support was added recently in this PR:
https://github.com/apache/zeppelin/pull/4257
The problem you mentioned is solved there.

Fabrizio Fab <fabrizio.dagost...@tiscali.it> wrote on Thu, Oct 28, 2021 at 7:43 PM:

> I am aware that Spark 3.2.0 is not yet officially supported, but I am trying
> to get it to work.
>
> The first thing that I noticed is the following:
>
> The SparkInterpreter is compiled against Scala 2.12.7.
>
> Spark 3.2 is compiled against Scala 2.12.15.
>
> Unfortunately there are some breaking changes between the two versions (even
> though only the patch version has changed... W.T.F.??), which require a
> recompile (hopefully with no code changes).
>
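> As a side note, a quick way to double-check which scala-library actually ends
> up on the interpreter classpath (just a diagnostic sketch; the object name is
> mine):
>
>     // Print the Scala version on the classpath and the jar it was loaded from,
>     // to spot a mismatch between Zeppelin's bundled Scala and the one Spark ships.
>     object ScalaVersionCheck {
>       def main(args: Array[String]): Unit = {
>         println(s"scala-library version: ${scala.util.Properties.versionNumberString}")
>         println("loaded from: " +
>           scala.util.Properties.getClass.getProtectionDomain.getCodeSource.getLocation)
>       }
>     }
>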
> The first incompatibility I ran into is at line 66 of
> SparkScala212Interpreter.scala:
>
>     val settings = new Settings()
>     settings.processArguments(List("-Yrepl-class-based",
>       "-Yrepl-outdir", s"${outputDir.getAbsolutePath}"), true)
>     settings.embeddedDefaults(sparkInterpreterClassLoader)
>
>     settings.usejavacp.value = true  // <-- fails at runtime
>
> scala.tools.nsc.Settings.usejavacp was moved in 2.12.13 from
> AbsSettings to MutableSettings, so you get a runtime error.
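>
> One workaround I am considering (just a sketch, not verified against 2.12.15):
> avoid the moved accessor entirely and pass -usejavacp through processArguments
> as a plain string, so the flag is resolved by whatever scala-compiler is on the
> classpath at runtime (outputDir and sparkInterpreterClassLoader are the same
> values as in the snippet above):
>
>     val settings = new Settings()
>     // Set -usejavacp via the string parser instead of the usejavacp accessor,
>     // so the compiled interpreter code never references the moved member.
>     settings.processArguments(List(
>       "-Yrepl-class-based",
>       "-Yrepl-outdir", s"${outputDir.getAbsolutePath}",
>       "-usejavacp"), true)
>     settings.embeddedDefaults(sparkInterpreterClassLoader)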
>
>
> I'll let you know if I manage to resolve all the problems.
>
>
>

-- 
Best Regards

Jeff Zhang
