[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15086092#comment-15086092 ]

Sean Owen commented on SPARK-4819:
----------------------------------

That's basically what I did, and then added some of Guava's Optional API methods 
to it as well, for backwards compatibility. But it's in an org.apache.spark 
package. (And I reimplemented it, since I don't think we can use Oracle's code 
here.)
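
For reference, a minimal sketch of the kind of reimplemented Optional described 
above (the package name, class shape, and method set here are illustrative 
assumptions, not the actual Spark code):

{code:java}
// Rough sketch only: a from-scratch Optional living under org.apache.spark,
// keeping a few Guava-named methods so existing user code still compiles.
package org.apache.spark.api.java;

import java.io.Serializable;
import java.util.NoSuchElementException;

public final class Optional<T> implements Serializable {

  private static final Optional<?> EMPTY = new Optional<>(null);

  private final T value;

  private Optional(T value) {
    this.value = value;
  }

  // java.util.Optional-style factories.
  @SuppressWarnings("unchecked")
  public static <T> Optional<T> empty() {
    return (Optional<T>) EMPTY;
  }

  public static <T> Optional<T> of(T value) {
    if (value == null) {
      throw new NullPointerException("value is null");
    }
    return new Optional<>(value);
  }

  // Guava-named aliases, kept for backwards compatibility.
  public static <T> Optional<T> absent() {
    return empty();
  }

  public static <T> Optional<T> fromNullable(T value) {
    return value == null ? Optional.<T>empty() : of(value);
  }

  public boolean isPresent() {
    return value != null;
  }

  public T get() {
    if (value == null) {
      throw new NoSuchElementException("value is absent");
    }
    return value;
  }

  // Guava-style name: returns the value if present, otherwise defaultValue.
  public T or(T defaultValue) {
    return value != null ? value : defaultValue;
  }

  // Guava-style name: returns the value if present, otherwise null.
  public T orNull() {
    return value;
  }
}
{code}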

Do you mean keep it in the java.util package? That has some appeal but isn't 
that prohibited by the JVM? I forget. I feel like this is inviting the same 
crazy classloader problems we had with the current Guava usage.
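
On the java.util question: I believe the prohibition is that the class loader 
rejects any user-defined class whose name starts with "java.", so a Spark-shipped 
java.util.Optional would fail to load. A self-contained sketch (class and method 
names made up) showing the behaviour:

{code:java}
public class ProhibitedPackageDemo {

  // Expose the protected defineClass(...) so we can call it directly.
  static final class BytesLoader extends ClassLoader {
    Class<?> define(String name, byte[] bytes) {
      return defineClass(name, bytes, 0, bytes.length);
    }
  }

  public static void main(String[] args) {
    try {
      // The name check runs before the bytes are parsed, so a dummy
      // byte array is enough to trigger it.
      new BytesLoader().define("java.util.MyOptional", new byte[0]);
    } catch (SecurityException e) {
      // Prints: java.lang.SecurityException: Prohibited package name: java.util
      System.out.println(e);
    }
  }
}
{code}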

> Remove Guava's "Optional" from public API
> -----------------------------------------
>
>                 Key: SPARK-4819
>                 URL: https://issues.apache.org/jira/browse/SPARK-4819
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Marcelo Vanzin
>            Assignee: Sean Owen
>         Attachments: SPARK_4819_null_do_not_merge.patch
>
>
> Filing this mostly so this isn't forgotten. Spark currently exposes Guava 
> types in its public API (the {{Optional}} class is used in the Java 
> bindings). This makes it hard to properly hide Guava from user applications, 
> and makes mixing different Guava versions with Spark a little sketchy (even 
> if things should work, since those classes are pretty simple in general).
> Since this changes the public API, it has to be done in a release that allows 
> such breakages. But it would be nice to at least have a transition plan for 
> deprecating the affected APIs.
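
A sketch of the kind of Java-binding signature the issue is referring to; the 
specific method shown is recalled from the 1.x API and should be treated as an 
assumption, not a verified signature:

{code:java}
// Illustrative only: roughly how Guava's Optional shows up in the Java bindings.
import com.google.common.base.Optional;        // Guava type in a public signature
import org.apache.spark.api.java.JavaPairRDD;
import scala.Tuple2;

public final class OptionalLeakExample {
  // Callers of the join have to handle com.google.common.base.Optional,
  // which ties user code to the Guava version Spark was built against.
  public static <K, V, W> JavaPairRDD<K, Tuple2<V, Optional<W>>> join(
      JavaPairRDD<K, V> left, JavaPairRDD<K, W> right) {
    return left.leftOuterJoin(right);
  }
}
{code}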


