[jira] [Updated] (SPARK-4819) Remove Guava's "Optional" from public API
[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4819:
-----------------------------
    Attachment: SPARK_4819_null_do_not_merge.patch

Here is what I tried when replacing Optional with use of null. It got well out of hand, even after I figured out an ugly workaround for varargs. The change was good in some ways (correctly expressing the reference-type bound for Java RDDs, fixing up a Scala/Java Long problem, removing some fake class tags), but it ended up far too unwieldy.

> Remove Guava's "Optional" from public API
> -----------------------------------------
>
>                 Key: SPARK-4819
>                 URL: https://issues.apache.org/jira/browse/SPARK-4819
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Marcelo Vanzin
>            Assignee: Sean Owen
>         Attachments: SPARK_4819_null_do_not_merge.patch
>
> Filing this mostly so this isn't forgotten. Spark currently exposes Guava
> types in its public API (the {{Optional}} class is used in the Java
> bindings). This makes it hard to properly hide Guava from user applications,
> and makes mixing different Guava versions with Spark a little sketchy (even
> if things should work, since those classes are pretty simple in general).
> Since this changes the public API, it has to be done in a release that allows
> such breakages. But it would be nice to at least have a transition plan for
> deprecating the affected APIs.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
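[Editorial note: the patch itself is not reproduced here, but the null-based API shape it describes can be sketched with a toy left outer join in plain Scala. The names and the `Map`-based "join" are purely illustrative, not Spark code; the point is that returning "value or null" instead of an Optional forces a `>: Null` bound so the compiler knows the value type is a reference type.]

```scala
object NullJoinSketch {
  // Toy model of a left outer join, illustrating the null-based API
  // shape the patch experimented with. With a Guava-style API the
  // missing right side would be Optional.absent(); here it is plain
  // null, which is why the `W >: Null` bound is required -- orNull
  // only compiles when the compiler can prove W is a reference type.
  def leftOuterJoin[K, V, W >: Null](left: Map[K, V],
                                     right: Map[K, W]): Map[K, (V, W)] =
    left.map { case (k, v) => k -> (v, right.get(k).orNull) }

  def main(args: Array[String]): Unit = {
    val users  = Map(1 -> "alice", 2 -> "bob")
    val emails = Map(1 -> "alice@example.com")
    val joined = leftOuterJoin(users, emails)
    assert(joined(1) == ("alice", "alice@example.com"))
    assert(joined(2) == ("bob", null)) // missing right side becomes null
    println("ok")
  }
}
```

Note that without the `W >: Null` bound the call to `orNull` does not compile at all, since `W` could be a value type like `Int` for which null is not a legal value; that is the constraint that had to be propagated through the Java API in the patch.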
[jira] [Updated] (SPARK-4819) Remove Guava's "Optional" from public API
[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4819:
-----------------------------
    Assignee: Sean Owen  (was: Reynold Xin)

I'm going to give this a try. The game plan is roughly to replace Optional with T, where the value can be null; that is the closest equivalent in the Java 7 JDK. Of course, Java 8 would be nice here too, and I'd be pleased to see a decision to require Java 8 for 2.x, but that's a different question.

One catch: it's simple to call {{v.orNull}} on an instance of Scala's {{Option[V]}} to get the value or null for use from Java; it's the mirror of {{Option(v)}}. However, this only works if Scala knows that type V is a reference type, which means adding {{>: Null}} bounds to a lot of types in the Java API. That isn't a bad idea in itself, since Java RDDs can only contain objects, not primitives, anyway. Comments?
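[Editorial note: the `orNull` / `Option(v)` mirror described above can be demonstrated in plain Scala, with no Spark involved. The helper name `getOrNull` is illustrative only.]

```scala
object OrNullSketch {
  // orNull compiles only when the compiler can prove V is a reference
  // type, i.e. that Null is a subtype of V -- hence the `V >: Null`
  // bound. Without it, `o.orNull` is rejected: V could be Int, and
  // null is not a valid Int.
  def getOrNull[V >: Null](o: Option[V]): V = o.orNull

  def main(args: Array[String]): Unit = {
    assert(getOrNull(Some("x")) == "x")
    assert(getOrNull(None: Option[String]) == null)
    // The mirror direction: Option(v) wraps a possibly-null value
    // back into an Option, mapping null to None.
    assert(Option(getOrNull(None: Option[String])) == None)
    println("ok")
  }
}
```

This is exactly the conversion pair a Java-facing wrapper needs: `orNull` on the way out to Java callers, `Option(v)` on the way back in.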
[jira] [Updated] (SPARK-4819) Remove Guava's "Optional" from public API
[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-4819:
-------------------------------
    Issue Type: Sub-task  (was: Task)
        Parent: SPARK-11806
[jira] [Updated] (SPARK-4819) Remove Guava's "Optional" from public API
[ https://issues.apache.org/jira/browse/SPARK-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4819:
-----------------------------
    Target Version/s: 2+