[ https://issues.apache.org/jira/browse/SPARK-3266?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14374859#comment-14374859 ]

Jacek Lewandowski commented on SPARK-3266:
------------------------------------------

[~joshrosen] it would be awesome to include this fix in 1.2.x. We recently 
added cross-compilation for Scala 2.10 and Scala 2.11 to our Spark Cassandra 
Connector. It turned out that we were unable to subclass {{JavaPairRDD}} in 
Java when the Scala 2.11 compiler was used. The compilation error was:
{noformat}
[error] 
/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/src/main/java/com/datastax/spark/connector/japi/rdd/CassandraJavaPairRDD.java:32:
 error: min(Comparator) in JavaPairRDD cannot implement min(Comparator<T>) in 
JavaRDDLike
[error] public class CassandraJavaPairRDD<K, V> extends JavaPairRDD<K, V> {
[error]        ^
[error]   return type Object is not compatible with Tuple2<K,V>
[error]   where T,K,V are type-variables:
[error]     T extends Object declared in interface JavaRDDLike
[error]     K extends Object declared in class CassandraJavaPairRDD
[error]     V extends Object declared in class CassandraJavaPairRDD
[error] 1 error
[error] (spark-cassandra-connector-java/compile:compile) javac returned nonzero 
exit code
[error] Total time: 3 s, completed Mar 22, 2015 8:46:28 AM
{noformat}

This seems to be closely related to this issue.
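For context, the error above is about the synthetic bridge method javac must generate when a subclass specialises a generic interface method: {{JavaRDDLike}} declares {{T min(Comparator<T>)}}, which erases to {{Object min(Comparator)}}, and javac bridges the specialised override to that erased signature. The sketch below uses hypothetical stand-in classes (not the real Spark API) to show the bridge javac normally emits; the failure reported here suggests the Scala 2.11 compiler emitted {{min}} on {{JavaPairRDD}} in a form javac could not match against the erased interface method.

{noformat}
import java.lang.reflect.Method;
import java.util.Comparator;

// Hypothetical stand-ins (not the real Spark classes) for JavaRDDLike and
// JavaPairRDD, to show the bridge method javac generates when a generic
// interface method is specialised in a subclass.
class Pair<K, V> {}                      // stands in for scala.Tuple2<K, V>

interface RDDLike<T> {                   // stands in for JavaRDDLike<T, ...>
    T min(Comparator<T> comp);
}

class PairRDD<K, V> implements RDDLike<Pair<K, V>> {  // stands in for JavaPairRDD
    @Override
    public Pair<K, V> min(Comparator<Pair<K, V>> comp) {
        return new Pair<>();
    }
}

public class BridgeDemo {
    public static void main(String[] args) {
        // javac emits two methods on PairRDD: the specialised
        // min(Comparator<Pair<K,V>>) and a synthetic bridge with the erased
        // signature Object min(Comparator), delegating to the first.
        for (Method m : PairRDD.class.getDeclaredMethods()) {
            System.out.println(m.getName()
                + " bridge=" + m.isBridge()
                + " returns " + m.getReturnType().getSimpleName());
        }
    }
}
{noformat}

When the superclass's {{min}} lacks the generic signature metadata javac needs, it cannot generate that bridge and rejects the subclass with exactly the "return type Object is not compatible with Tuple2<K,V>" message shown above.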


> JavaDoubleRDD doesn't contain max()
> -----------------------------------
>
>                 Key: SPARK-3266
>                 URL: https://issues.apache.org/jira/browse/SPARK-3266
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.0.1, 1.0.2, 1.1.0, 1.2.0
>            Reporter: Amey Chaugule
>            Assignee: Sean Owen
>             Fix For: 1.3.1, 1.4.0
>
>         Attachments: spark-repro-3266.tar.gz
>
>
> While I can compile my code, I see:
> Caused by: java.lang.NoSuchMethodError: 
> org.apache.spark.api.java.JavaDoubleRDD.max(Ljava/util/Comparator;)Ljava/lang/Double;
> when I try to execute my Spark code. Stepping into the JavaDoubleRDD class, I 
> don't notice max(), although it is clearly listed in the documentation.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
