[ 
https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-4819:
-----------------------------
    Assignee: Sean Owen  (was: Reynold Xin)

I'm going to give this a try. The game plan is roughly to replace Optional<T> 
with T, where the value can be null. This is the closest equivalent in the Java 
7 JDK. Of course, Java 8's java.util.Optional would be nice here too, and I'd 
be pleased to see Spark decide to require Java 8 for 2.x, but that's a 
different question.

One catch: it's simple to use {{v.orNull}} on an instance of Scala's 
{{Option[V]}} to get the value or null, for use with Java; it's the mirror of 
{{Option(v)}}. However, this only works if Scala knows that type V is a 
reference type, so it means adding {{>: Null}} bounds to a lot of types in the 
Java API. That isn't a bad idea, in that Java RDDs can only contain objects, 
not primitives, anyway.
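To make the catch concrete, here's a minimal sketch (hypothetical method name 
{{fromOption}}) of why the {{>: Null}} lower bound is needed: {{orNull}} on 
{{Option[V]}} requires implicit evidence that {{Null}} is a subtype of {{V}}, 
which an unbounded type parameter doesn't provide.

```scala
object OrNullDemo {
  // Without a bound, this does not compile: orNull needs implicit
  // evidence Null <:< V, and an unbounded V could be a primitive/value type.
  // def fromOption[V](opt: Option[V]): V = opt.orNull  // compile error

  // The lower bound V >: Null supplies that evidence, mirroring the
  // ">: Null" bounds proposed for the Java API's type parameters.
  def fromOption[V >: Null](opt: Option[V]): V = opt.orNull

  def main(args: Array[String]): Unit = {
    println(fromOption(Option("x")))   // prints "x"
    println(fromOption[String](None))  // prints "null"
  }
}
```

Since the Java bindings only ever deal in reference types anyway, the bound 
costs nothing at Java call sites.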

Comments?

> Remove Guava's "Optional" from public API
> -----------------------------------------
>
>                 Key: SPARK-4819
>                 URL: https://issues.apache.org/jira/browse/SPARK-4819
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Marcelo Vanzin
>            Assignee: Sean Owen
>
> Filing this mostly so this isn't forgotten. Spark currently exposes Guava 
> types in its public API (the {{Optional}} class is used in the Java 
> bindings). This makes it hard to properly hide Guava from user applications, 
> and makes mixing different Guava versions with Spark a little sketchy (even 
> if things should work, since those classes are pretty simple in general).
> Since this changes the public API, it has to be done in a release that allows 
> such breakages. But it would be nice to at least have a transition plan for 
> deprecating the affected APIs.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
