[ https://issues.apache.org/jira/browse/SPARK-6399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14486595#comment-14486595 ]
Patrick Wendell commented on SPARK-6399:
----------------------------------------

It would be good to document more clearly what compatibility we intend to provide. I am not so sure that forward compatibility is a stated or necessary goal for binary interfaces. I think we should just provide backwards compatibility for those interfaces (though in practice these will almost always be the same except for some issues like this with implicits). The main area where we've had really strict enforcement of forward compatibility has been the serialization format of JSON logs, since we want it to be easy for people to use the Spark history server with newer versions of Spark in a multi-tenant cluster.

> Code compiled against 1.3.0 may not run against older Spark versions
> --------------------------------------------------------------------
>
>                 Key: SPARK-6399
>                 URL: https://issues.apache.org/jira/browse/SPARK-6399
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Marcelo Vanzin
>
> Commit 65b987c3 re-organized the implicit conversions of RDDs so that they're
> easier to use. The problem is that scalac now generates code that will not
> run on older Spark versions if those conversions are used.
> Basically, even if you explicitly import {{SparkContext._}}, scalac will
> generate references to the new methods in the {{RDD}} object instead. So the
> compiled code will reference code that doesn't exist in older versions of
> Spark.
> You can work around this by explicitly calling the methods in the
> {{SparkContext}} object, although that's a little ugly.
> We should at least document this limitation (if there's no way to fix it),
> since I believe forwards compatibility in the API was also a goal.
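The workaround described in the issue — calling the conversion method on the {{SparkContext}} object explicitly instead of relying on the implicit — can be sketched as follows. This is a minimal illustration compiled against the Spark 1.3 API; the job setup ({{local}} master, sample data) is invented for the example, and the key point is the explicit {{SparkContext.rddToPairRDDFunctions}} call, which makes the bytecode reference the conversion's pre-1.3 location:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // pre-1.3 implicits (still present in 1.3)
import org.apache.spark.rdd.RDD

object CompatDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("compat-demo").setMaster("local"))
    val pairs: RDD[(String, Int)] = sc.parallelize(Seq(("a", 1), ("a", 2)))

    // Relying on the implicit conversion here makes scalac (building
    // against 1.3.0) emit a reference to the new conversion on the RDD
    // object, which does not exist in 1.2.x and earlier:
    //   val counts = pairs.reduceByKey(_ + _)

    // Workaround from the issue: invoke the conversion on the
    // SparkContext object explicitly, so the compiled code references
    // the location that exists in older versions as well:
    val counts = SparkContext.rddToPairRDDFunctions(pairs).reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

Note that the explicit call produces the same {{PairRDDFunctions}} wrapper as the implicit would; only the symbol the compiled class file references changes.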
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org