[ https://issues.apache.org/jira/browse/SPARK-25369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16649495#comment-16649495 ]
Sean Owen commented on SPARK-25369:
-----------------------------------

This won't work. The problem is that an API method accepting, say, a {{java.util.function.Function}} takes an argument that is not {{Serializable}}. There are easy ways for the caller to force it to be, but otherwise callers will find that their tasks never serialize. I think we have to keep our interface functions, which all {{extend Serializable}}. The upside is that this is a good reason not to change this API at all. We can, however, retrofit the interfaces to also extend their Java counterparts, I believe. This might buy a little more interoperability for free.

> Replace Java shim functional interfaces like java.api.Function with Java 8 equivalents
> --------------------------------------------------------------------------------------
>
> Key: SPARK-25369
> URL: https://issues.apache.org/jira/browse/SPARK-25369
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.4.0
> Reporter: Sean Owen
> Priority: Minor
> Labels: release-notes
>
> In Spark 3, we should remove interfaces like org.apache.spark.api.java.function.Function and replace them with java.util.function equivalents, for better compatibility with Java 8. This would let callers pass an existing functional object in Java in more cases, rather than wrap it in a lambda.
> It's possible to have the functional interfaces in Spark just extend the Java 8 functional interfaces to interoperate better with existing code, but it might be just as well to remove them in Spark 3 to clean up.
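To illustrate the serialization concern from the comment above, here is a minimal standalone sketch (class and field names are mine, not Spark code): a plain {{java.util.function.Function}} lambda is not {{Serializable}}, while an intersection cast is one "easy way" for a caller to force it to be.

```java
import java.io.Serializable;
import java.util.function.Function;

public class SerializableLambdaDemo {
    // A plain java.util.function.Function lambda does NOT implement
    // Serializable, so it could not be shipped as part of a task closure.
    static final Function<String, Integer> plain = String::length;

    // An intersection cast makes the compiler generate a lambda class
    // that implements both interfaces, so it can be serialized.
    static final Function<String, Integer> serializable =
        (Function<String, Integer> & Serializable) String::length;

    public static void main(String[] args) {
        System.out.println(plain instanceof Serializable);        // false
        System.out.println(serializable instanceof Serializable); // true
    }
}
```

This is why Spark's own interfaces extend {{Serializable}} directly: callers get serializable lambdas without having to remember the cast.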
> A partial list of transitions from Spark to Java interfaces:
> * Function -> Function
> * Function0 -> Supplier
> * Function2 -> BiFunction
> * VoidFunction -> Consumer
> * FlatMapFunction etc. -> extends Function<T, Iterable<R>> etc.
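The retrofit idea from the comment (keep Spark's serializable interfaces, but have them extend their Java counterparts) could look roughly like the following sketch for the Function2 -> BiFunction pair. The default-method bridge and exception handling here are my assumptions, not actual Spark code; the interface shape just mirrors {{org.apache.spark.api.java.function.Function2}}.

```java
import java.io.Serializable;
import java.util.function.BiFunction;

// Hypothetical retrofit: the Spark-style interface keeps its throwing
// call() method as the single abstract method, extends the JDK's
// BiFunction, and bridges apply() to call() via a default method.
interface Function2<T1, T2, R> extends BiFunction<T1, T2, R>, Serializable {
    R call(T1 v1, T2 v2) throws Exception;

    @Override
    default R apply(T1 v1, T2 v2) {
        try {
            return call(v1, v2);
        } catch (Exception e) {
            // Wrap the checked exception that call() may throw, since
            // BiFunction.apply declares none.
            throw new RuntimeException(e);
        }
    }
}
```

With this shape, any API that accepts a {{java.util.function.BiFunction}} can take a Spark-style {{Function2}} directly, which is the "interoperability for free" the comment mentions.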