[ https://issues.apache.org/jira/browse/SPARK-6399?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-6399.
-----------------------------------
    Resolution: Won't Fix

I think at this point it's pretty clear we won't do anything here.

> Code compiled against 1.3.0 may not run against older Spark versions
> --------------------------------------------------------------------
>
>                 Key: SPARK-6399
>                 URL: https://issues.apache.org/jira/browse/SPARK-6399
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> Commit 65b987c3 re-organized the implicit conversions of RDDs so that they're 
> easier to use. The problem is that scalac now generates code that will not 
> run on older Spark versions if those conversions are used.
> Basically, even if you explicitly import {{SparkContext._}}, scalac will 
> still generate references to the new methods in the {{RDD}} object, so the 
> resulting bytecode refers to methods that don't exist in older versions of 
> Spark.
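>
> For illustration, a minimal sketch of a job that trips over this (the app 
> name and input/output paths are made up; any use of the pair-RDD implicits 
> would do):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.SparkContext._  // the old home of the implicits
>
> object WordCount {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(new SparkConf().setAppName("WordCount"))
>     // reduceByKey needs the RDD[(K, V)] => PairRDDFunctions conversion.
>     // Compiled against 1.3.0, scalac binds it to the new
>     // RDD.rddToPairRDDFunctions rather than the imported
>     // SparkContext.rddToPairRDDFunctions.
>     val counts = sc.textFile("input.txt")
>       .flatMap(_.split("\\s+"))
>       .map(word => (word, 1))
>       .reduceByKey(_ + _)
>     counts.saveAsTextFile("counts")
>     sc.stop()
>   }
> }
> {code}
> The same jar run against a 1.2.x cluster should then fail at the 
> {{reduceByKey}} call site with a {{NoSuchMethodError}} for the missing 
> conversion, even though the source compiles cleanly against both versions.
>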
> You can work around this by explicitly calling the methods in the 
> {{SparkContext}} object, although that's a little ugly.
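>
> A sketch of that workaround, using {{rddToPairRDDFunctions}} as the example 
> conversion (the other conversions work the same way; the {{SparkContext}} 
> versions are deprecated in 1.3.0, so expect a deprecation warning):
> {code}
> import org.apache.spark.SparkContext
> import org.apache.spark.rdd.RDD
>
> // Calling the conversion on the SparkContext object explicitly makes the
> // compiled code reference SparkContext.rddToPairRDDFunctions, which exists
> // on both sides of 1.3.0, instead of the new method on the RDD companion
> // object.
> def wordCounts(words: RDD[String]): RDD[(String, Int)] =
>   SparkContext.rddToPairRDDFunctions(words.map((_, 1))).reduceByKey(_ + _)
> {code}
>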
> We should at least document this limitation (if there's no way to fix it), 
> since I believe forward compatibility of the API was also a goal.


