eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1658336924

   > I think the issue is you will target Java 17 bytecode if running on 17, when we want to target 8 in all cases
   
   If that is the case, then the changes currently in this PR are not what we want. But are we really sure that this is expected or actually used? As far as I can tell, this has not actually worked in the past: if I take a Spark 3.4.1 build that I built from the v3.4.1 tag on Java 11 and then try to run `spark-submit run-example SparkPi` on Java 8, it fails with
   ```
   2023-07-31 14:59:15,304 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) failed in 0.259 s due to Job aborted due to stage failure: Task serialization failed: java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
   java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
        at org.apache.spark.util.io.ChunkedByteBufferOutputStream.toChunkedByteBuffer(ChunkedByteBufferOutputStream.scala:115)
        at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:362)
        at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:160)
        at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:99)
        at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:38)
        at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:78)
        at org.apache.spark.SparkContext.broadcastInternal(SparkContext.scala:1548)
        at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1530)
        at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1535)
        at org.apache.spark.scheduler.DAGScheduler.submitStage(DAGScheduler.scala:1353)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1295)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2931)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2923)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2912)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
   ```
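   
   For context, this is the JDK 9 `ByteBuffer` covariant-return-type trap: JDK 9+ overrides `flip()` directly on `ByteBuffer` (returning `ByteBuffer`), while JDK 8 only has `Buffer.flip()` (returning `Buffer`). A minimal sketch of the trap outside of Spark (hypothetical repro code, not from this PR):
   ```scala
   import java.nio.ByteBuffer
   
   object FlipRepro {
     def main(args: Array[String]): Unit = {
       val buf = ByteBuffer.allocate(16)
       buf.put("hello".getBytes("UTF-8"))
       // Compiled on JDK 9+ without -release 8, this call is emitted as
       // java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer; (the covariant
       // override). A Java 8 runtime only has
       // java.nio.Buffer.flip()Ljava/nio/Buffer;, so executing this class
       // there throws the same NoSuchMethodError as above.
       buf.flip()
       println(buf.remaining()) // prints 5 on any JDK where the class loads
     }
   }
   ```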
   
   I think the problem here boils down to the fact that we have previously used the scalac flag `-target` to attempt to achieve what you describe. But according to the comment here (https://github.com/scala/bug/issues/12643#issuecomment-1253761646):
   
   > -target says "emit class file of version N, but I want to use arbitrary classes from the JDK and take my chances".
   
   So specifying only `-target` is not the proper way to build on a later Java version while targeting Java 8. My understanding is that if that is what we actually want, then we would need to specify the Java version using `-release` and actually fix the build errors that it causes.
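   
   To make the distinction concrete, here is a hedged sketch of the two flags in an sbt build definition (illustrative only; Spark's actual build wires its compiler flags through its own Maven/sbt profiles):
   ```scala
   // Illustrative sbt settings, not taken from Spark's build.
   // -release makes scalac emit Java 8 bytecode *and* resolve JDK symbols
   // against the Java 8 API, so ByteBuffer.flip() links against
   // java.nio.Buffer.flip()Ljava/nio/Buffer; even when compiling on JDK 11+.
   scalacOptions ++= Seq("-release", "8")
   
   // -target alone only sets the class-file version; the compiler still
   // links against the compile-time JDK's class library, which is the exact
   // failure mode in the stack trace above.
   // scalacOptions += "-target:jvm-1.8"   // insufficient on its own
   ```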

