[ https://issues.apache.org/jira/browse/SPARK-34258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-34258.
-----------------------------------
    Resolution: Duplicate

This is fixed by SPARK-34505 via upgrading to Scala 2.13.5, which includes the upstream fix:

- https://github.com/scala/scala/commit/845cda33e16abbddd7af54030bc1ae73d00c2e34
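For projects hitting this outside of the Spark build, a minimal sketch of the corresponding sbt change (the same dependencies as the repro below, with only the Scala version bumped to 2.13.5, which per the resolution above contains the upstream fix):

{code}
// build.sbt -- repro configuration from the description, with Scala bumped
// to 2.13.5, the version that contains the fix for scala/bug#12038
scalaVersion := "2.13.5"

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-reflect" % scalaVersion.value,
  "com.google.protobuf" % "protobuf-java" % "3.11.4"
)
{code}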

> Upgrade Scala to avoid Scala's bug about illegal cyclic reference
> -----------------------------------------------------------------
>
>                 Key: SPARK-34258
>                 URL: https://issues.apache.org/jira/browse/SPARK-34258
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: wuyi
>            Priority: Major
>
> There is a regression in Scala 2.13. We should upgrade to a newer Scala 2.13
> maintenance version before the Spark 3.2.0 release. Otherwise, the following:
> {code}
> scalaVersion := "2.13.2"
> libraryDependencies ++= Seq(
>   "org.scala-lang" % "scala-reflect" % scalaVersion.value,
>   "com.google.protobuf" % "protobuf-java" % "3.11.4"
> )
> {code}
> {code}
> package foo
> import scala.reflect.runtime.universe
> import com.google.protobuf.DescriptorProtos.FileDescriptorProto
> object Hello {
>   def fromJava(t: FileDescriptorProto): Unit = {}
> }
> object Main {
>   def main(args: Array[String]): Unit = {
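>     // Reflectively looking up and instantiating the Hello module is what
>     // triggers the CyclicReference error shown below.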
>     val mirror = universe.runtimeMirror(getClass.getClassLoader())
>     println(
>       mirror.reflectModule(
>         mirror.staticModule("foo.Hello$"))
>       .instance)
>   }
> }
> {code}
> fails with the error below:
> {code}
> [info] running foo.Main
> [error] (run-main-b) scala.reflect.internal.Symbols$CyclicReference: illegal cyclic reference involving type BuilderType
> {code}
> See https://github.com/scala/bug/issues/12038 for more details.



