[ https://issues.apache.org/jira/browse/SPARK-26037?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Damian Momot resolved SPARK-26037.
----------------------------------
    Resolution: Not A Problem

> ./dev/make-distribution.sh fails for Scala 2.12
> -----------------------------------------------
>
>                 Key: SPARK-26037
>                 URL: https://issues.apache.org/jira/browse/SPARK-26037
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.4.0
>            Reporter: Damian Momot
>            Priority: Major
>
> Trying to create a Spark distribution for Scala 2.12 and Spark 2.4.0.
> First, add the Cloudera repository:
> {code:java}
> <repository>
>   <id>cloudera</id>
>   <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
> </repository>
> {code}
> Then try to make the distribution:
> {code:java}
> ./dev/change-scala-version.sh 2.12
> ./dev/make-distribution.sh --name scala_2.12-cdh5.15.1 --tgz -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.15.1 -DskipTests -Phive -Phive-thriftserver -Pyarn -Dscala-2.12
> {code}
> I'm getting multiple failures in Spark Project Sketch. They all look the same; here's the first for reference:
> {code:java}
> ~/spark-2.4.0/common/sketch/src/test/scala/org/apache/spark/util/sketch/BitArraySuite.scala:32: exception during macro expansion:
> java.lang.ClassNotFoundException: scala.runtime.LazyRef
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at org.scalactic.MacroOwnerRepair$Utils.repairOwners(MacroOwnerRepair.scala:66)
>   at org.scalactic.MacroOwnerRepair.repairOwners(MacroOwnerRepair.scala:46)
>   at org.scalactic.BooleanMacro.genMacro(BooleanMacro.scala:837)
>   at org.scalatest.AssertionsMacro$.assert(AssertionsMacro.scala:34)
> assert(new BitArray(64).bitSize() == 64)
> {code}
> The same commands for Scala 2.11 work fine.
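
A note on the likely cause (an inference from the stack trace; the resolving comment is not quoted in this message): scala.runtime.LazyRef exists only in the Scala 2.12 standard library, so a ClassNotFoundException for it during macro expansion is the signature of Scala 2.12 test artifacts (here scalatest/scalactic's assertion macros) being expanded by a Scala 2.11 compiler. ./dev/change-scala-version.sh 2.12 only rewrites the _2.11 artifact suffixes in the POMs; in Spark 2.4.0 the compiler version itself is selected by the scala-2.12 Maven profile, which the 2.4.0 building docs activate with -Pscala-2.12 rather than the -Dscala-2.12 property used above. A stale zinc incremental-compile server left over from a 2.11 build can produce the same error, so shutting it down before rebuilding is also worth trying. A sketch of the corrected invocation, keeping the reporter's CDH flags:

{code:java}
./dev/change-scala-version.sh 2.12
./dev/make-distribution.sh --name scala_2.12-cdh5.15.1 --tgz \
  -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.15.1 \
  -DskipTests -Phive -Phive-thriftserver -Pyarn \
  -Pscala-2.12
{code}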
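For context on why LazyRef in particular is the class that breaks (an illustrative sketch, not taken from the ticket): Scala 2.12 changed the encoding of local lazy vals to use holder classes such as scala.runtime.LazyRef, so it is among the first 2.12-only runtime classes that a mixed 2.11/2.12 classpath trips over.

{code:java}
// Illustrative sketch: where scala.runtime.LazyRef enters the picture.
// Compiled with scalac 2.12, the local lazy val below is encoded with a
// scala.runtime.LazyRef holder; that class is absent from scala-library
// 2.11, so mixing the two versions surfaces the same missing class as
// the build failure above.
object LazyRefDemo {
  def main(args: Array[String]): Unit = {
    lazy val name = "BitArray" // reference-typed local lazy val => LazyRef on 2.12
    println(name)
  }
}
{code}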