[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2016-01-06 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15086092#comment-15086092 ]

Sean Owen commented on SPARK-4819:
--

That's basically what I did, and then added some of Guava's API methods to it as well, to improve backwards compatibility. But it's in an org.apache.spark package. (And I reimplemented it, since I don't think we can use Oracle's code here.)

Do you mean keeping it in the java.util package? That has some appeal, but isn't that prohibited by the JVM? I forget. I feel like this invites the same crazy classloader problems we had with the current Guava usage.
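
For illustration only, a rough sketch of what such a Spark-owned replacement could look like, with Java-8-style factories plus Guava-style aliases for compatibility. The package name and exact method set below are hypothetical, not necessarily what the pull request adds:

{code:java}
package org.apache.spark.example; // hypothetical; the comment only says "an org.apache.spark package"

import java.io.Serializable;
import java.util.NoSuchElementException;

/** Minimal Optional-like holder exposing both Java-8-style and Guava-style methods. */
public final class Optional<T> implements Serializable {

  private static final Optional<?> EMPTY = new Optional<>(null);

  private final T value;

  private Optional(T value) {
    this.value = value;
  }

  // Java-8-style factories
  @SuppressWarnings("unchecked")
  public static <T> Optional<T> empty() {
    return (Optional<T>) EMPTY;
  }

  public static <T> Optional<T> of(T value) {
    if (value == null) {
      throw new NullPointerException("value must be present");
    }
    return new Optional<>(value);
  }

  public static <T> Optional<T> ofNullable(T value) {
    return value == null ? Optional.<T>empty() : of(value);
  }

  // Guava-style aliases, kept for backwards compatibility
  public static <T> Optional<T> absent() {
    return empty();
  }

  public static <T> Optional<T> fromNullable(T value) {
    return ofNullable(value);
  }

  // Accessors shared by both APIs
  public boolean isPresent() {
    return value != null;
  }

  public T get() {
    if (value == null) {
      throw new NoSuchElementException("value is absent");
    }
    return value;
  }

  public T or(T defaultValue) {      // Guava-style
    return value != null ? value : defaultValue;
  }

  public T orElse(T defaultValue) {  // Java-8-style
    return or(defaultValue);
  }

  public T orNull() {                // Guava-style
    return value;
  }
}
{code}

Keeping both naming schemes would let existing call sites that use the Guava-style names keep compiling, while new code can use the Java-8-style ones.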

> Remove Guava's "Optional" from public API
> -
>
> Key: SPARK-4819
> URL: https://issues.apache.org/jira/browse/SPARK-4819
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 1.2.0
>Reporter: Marcelo Vanzin
>Assignee: Sean Owen
> Attachments: SPARK_4819_null_do_not_merge.patch
>
>
> Filing this mostly so this isn't forgotten. Spark currently exposes Guava 
> types in its public API (the {{Optional}} class is used in the Java 
> bindings). This makes it hard to properly hide Guava from user applications, 
> and makes mixing different Guava versions with Spark a little sketchy (even 
> if things should work, since those classes are pretty simple in general).
> Since this changes the public API, it has to be done in a release that allows 
> such breakages. But it would be nice to at least have a transition plan for 
> deprecating the affected APIs.






[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2016-01-06 Thread Markus Weimer (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15086083#comment-15086083 ]

Markus Weimer commented on SPARK-4819:
--

Over in REEF, we just backported Java 8's {{Optional}} class. Would that be an option for Spark as well? If so, maybe it could go into one of the Commons packages, so that we don't clone the same trivial class over and over again :-)




[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2015-12-29 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15074380#comment-15074380 ]

Apache Spark commented on SPARK-4819:
-

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/10513




[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2015-12-29 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15074001#comment-15074001 ]

Sean Owen commented on SPARK-4819:
--

OK, I wrestled with this for a couple of hours and think this approach will fail: the problem is that adding the type constraint ultimately means almost every signature in the Java APIs has to include the constraint. That starts to cause funny problems: you can't have the RDD.toJavaRDD method anymore, since it can't guarantee the bound. Ultimately the only unfixable problem I ran into was the workaround for Java varargs in the union() method. That hackery to work around scalac issues just won't fly; the type bound isn't allowed on the Scala code extending the Java class.
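
To make the propagation problem concrete, here is a generic, hypothetical sketch with invented names (and in plain Java, so it only shows the bound leaking through signatures, not the scalac limitation itself):

{code:java}
// Invented names for illustration; these are not Spark's real classes.
public class BoundPropagation {

  // Stand-in for whatever constraint the Java API would need.
  interface Bound {}

  // The Java-facing wrapper demands the bound on its element type.
  static final class JavaWrapper<T extends Bound> {
    final T value;
    JavaWrapper(T value) { this.value = value; }
  }

  // A core type without the bound cannot offer a conversion to the wrapper:
  static final class CoreUnbounded<T> {
    final T value;
    CoreUnbounded(T value) { this.value = value; }
    // JavaWrapper<T> toJavaWrapper() { return new JavaWrapper<>(value); }
    // ^ does not compile: T is not known to satisfy Bound
  }

  // So the bound leaks into the core type, and from there into every
  // signature that mentions it.
  static final class CoreBounded<T extends Bound> {
    final T value;
    CoreBounded(T value) { this.value = value; }
    JavaWrapper<T> toJavaWrapper() { return new JavaWrapper<>(value); }
  }
}
{code}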

Options seem to be:

- Push further ahead and drop the varargs version of JavaSparkContext.union() 
as collateral damage and see what other surprises are in store
- Copy and paste the Optional class from Guava and make it an official Spark 
API class
- Require Java 8 and use its version
- ... copy and paste Java 8's version




[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2015-03-13 Thread Marcelo Vanzin (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14361558#comment-14361558 ]

Marcelo Vanzin commented on SPARK-4819:
---

Java 8 has {{java.util.Optional}}, which looks eerily like Guava's, so that might be an easy way out. I guess it would be OK to bump the requirement to Java 8 at that point (given that Java 7 is EOL this year).
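
For reference, a quick side-by-side of the factory and accessor methods, with Guava's rough equivalents shown in comments (both sets of calls are standard public API):

{code:java}
import java.util.Optional;                                // Java 8's version
// Guava's com.google.common.base.Optional has a near-identical shape.

public class OptionalComparison {
  public static void main(String[] args) {
    // Java 8 call                                        // Guava rough equivalent
    Optional<String> a = Optional.of("x");                // Optional.of("x")
    Optional<String> b = Optional.empty();                // Optional.absent()
    Optional<String> c = Optional.ofNullable((String) null); // Optional.fromNullable(null)

    System.out.println(a.isPresent());                    // isPresent() exists in both
    System.out.println(a.get());                          // get() exists in both
    System.out.println(b.orElse("default"));              // Guava: b.or("default")
    System.out.println(c.orElse(null));                   // Guava: c.orNull()
  }
}
{code}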




[jira] [Commented] (SPARK-4819) Remove Guava's "Optional" from public API

2015-02-08 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14311610#comment-14311610 ]

Sean Owen commented on SPARK-4819:
--

Yeah, there are several things we can't do until Spark 2.x but should do. Would it be OK to create a "2.0.0" version in JIRA to assign these to? I'm also thinking of SPARK-3266 and SPARK-3369.
