eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1659753792

   > change -target:jvm-1.8 to -release:8, both line 2911 and line 3652
   
   Hardcoding `-release:8` with the new default activation will not actually set the `-release` config to 8. This is because the scala-maven-plugin also appends a `-release` flag based on the `java.version` property, and since the flag appended by scala-maven-plugin comes later in the list of args, it takes precedence. So while your suggestion does compile, it will not have produced a Java 8 release. The args can be seen by passing `-X` to Maven:
   
   ```
   $ ./build/mvn clean compile -Pscala-2.13 -X
   ...
   [DEBUG] [zinc] Running cached compiler 76b0ae1b for Scala compiler version 2.13.11
   [DEBUG] [zinc] The Scala compiler is invoked with:
           -unchecked
           -deprecation
           -feature
           -explaintypes
           -release:8
           -Wconf:cat=deprecation:wv,any:e
           -Wunused:imports
           -Wconf:cat=scaladoc:wv
           -Wconf:cat=lint-multiarg-infix:wv
           -Wconf:cat=other-nullary-override:wv
           -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
           -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
           -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
           -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
           -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
           -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
           -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
           -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
           -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
           -Wconf:cat=unchecked&msg=outer reference:s
           -Wconf:cat=unchecked&msg=eliminated by erasure:s
           -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
           -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
           -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
           -Wconf:msg=Implicit definition should have explicit type:s
           -release
           17
           -bootclasspath
           /home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
           -classpath
           /home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-io/9.4.50.v20221201/jetty-io-9.4.50.v20221201.jar:/home/eejbyfeldt/.m2/repository/org/slf4j/slf4j-api/2.0.7/slf4j-api-2.0.7.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-client/9.4.51.v20230217/jetty-client-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-http/9.4.51.v20230217/jetty-http-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-util/9.4.51.v20230217/jetty-util-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/eejbyfeldt/dev/apache/spark/common/tags/target/scala-2.13/classes:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-reflect/2.13.11/scala-reflect-2.13.11.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-compiler/2.13.11/scala-compiler-2.13.11.jar:/home/eejbyfeldt/.m2/repository/io/github/java-diff-utils/java-diff-utils/4.12/java-diff-utils-4.12.jar:/home/eejbyfeldt/.m2/repository/org/jline/jline/3.22.0/jline-3.22.0.jar:/home/eejbyfeldt/.m2/repository/net/java/dev/jna/jna/5.13.0/jna-5.13.0.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
   ...
   ```
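
   The "last flag wins" behavior can be sketched as follows. This is an illustrative model only, not the actual scala-maven-plugin or scalac code; it just shows how a parser that consumes the argument list in order ends up honoring the later `-release 17` over the hardcoded `-release:8` (both forms from the dump are handled):

   ```java
   import java.util.Arrays;
   import java.util.List;

   public class ReleasePrecedence {
       // Scan the args in order; a repeated flag simply overwrites the
       // previous value, so the last occurrence wins.
       static String effectiveRelease(List<String> args) {
           String release = null;
           for (int i = 0; i < args.size(); i++) {
               String arg = args.get(i);
               if (arg.startsWith("-release:")) {
                   release = arg.substring("-release:".length());   // "-release:8" form
               } else if (arg.equals("-release") && i + 1 < args.size()) {
                   release = args.get(++i);                          // "-release 17" form
               }
           }
           return release;
       }

       public static void main(String[] args) {
           // Hardcoded flag first, plugin-appended flag later:
           System.out.println(effectiveRelease(
               Arrays.asList("-release:8", "-unchecked", "-release", "17")));
           // prints 17
       }
   }
   ```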
   
   Running with `-Djava.version=8` sets the release to 8 properly, and compilation then fails with:
   ```
   $ ./build/mvn clean compile -Pscala-2.13 -Djava.version=8
   ...
   [INFO] Compiler bridge file: /home/eejbyfeldt/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.8.0-bin_2.13.11__61.0-1.8.0_20221110T195421.jar
   [INFO] compiling 603 Scala sources and 77 Java sources to /home/eejbyfeldt/dev/apache/spark/core/target/scala-2.13/classes ...
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [ERROR] 23 errors found
   ```
   
   Based on the discussion in https://github.com/scala/bug/issues/12643, I believe this is the expected behavior.
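
   For anyone who wants to double-check which `-release` actually won, one option is to inspect the class-file major version of the compiled output (52 = Java 8, 61 = Java 17). A small sketch; the path you would pass is illustrative, e.g. something under `core/target/scala-2.13/classes`:

   ```java
   import java.io.DataInputStream;
   import java.io.FileInputStream;
   import java.io.IOException;

   public class ClassFileVersion {
       // Read the class-file major version: the file starts with the magic
       // 0xCAFEBABE, followed by the minor and major version shorts.
       static int majorVersion(String path) throws IOException {
           try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
               if (in.readInt() != 0xCAFEBABE) {
                   throw new IOException(path + " is not a class file");
               }
               in.readUnsignedShort();        // minor version
               return in.readUnsignedShort(); // major version: 52 = Java 8, 61 = Java 17
           }
       }

       public static void main(String[] args) throws IOException {
           // e.g. pass a path like core/target/scala-2.13/classes/org/apache/spark/SparkContext.class
           if (args.length > 0) {
               System.out.println(majorVersion(args[0]));
           }
       }
   }
   ```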
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

