Hi Patrick,

What are the expectations / guarantees on binary compatibility between
0.9 and 1.0?

You mention some API changes, which hint that binary compatibility
has already been broken, but I wanted to point out that there are
other cases too, e.g.:

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:236)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:47)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoSuchMethodError:
org.apache.spark.SparkContext$.rddToOrderedRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/Function1;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/OrderedRDDFunctions;

(Compiled against 0.9, run against 1.0.)
Offending code:

      val top10 = counts.sortByKey(false).take(10)

Recompiling fixes the problem.
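
For reference, here's a minimal sketch of the pattern that triggers
it (assuming a SparkContext `sc`; the input path is made up):

      // Compiled against spark-core 0.9.x, run against 1.0.
      import org.apache.spark.SparkContext
      import org.apache.spark.SparkContext._  // pulls in rddToOrderedRDDFunctions

      val counts = sc.textFile("hdfs:///some/input")  // hypothetical input
        .flatMap(_.split("\\s+"))
        .map(word => (word, 1L))
        .reduceByKey(_ + _)

      // sortByKey is added by the implicit rddToOrderedRDDFunctions
      // conversion; the 0.9 compiler bakes that method's old signature
      // into the bytecode, and 1.0 no longer has a method with that
      // signature, hence the NoSuchMethodError at runtime.
      val top10 = counts.sortByKey(false).take(10)

Judging from the Function1 in the mangled signature, the 0.9 implicit
took a view bound that 1.0 dropped, so the old symbol is simply gone.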


On Tue, Apr 29, 2014 at 1:05 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> Hey All,
>
> This is not an official vote, but I wanted to cut an RC so that people can
> test against the Maven artifacts, test building with their configuration,
> etc. We are still chasing down a few issues and updating docs, etc.
>
> If you have issues or bug reports for this release, please send an e-mail
> to the Spark dev list and/or file a JIRA.
>
> Commit: d636772 (v1.0.0-rc3)
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=d636772ea9f98e449a038567b7975b1a07de3221
>
> Binaries:
> http://people.apache.org/~pwendell/spark-1.0.0-rc3/
>
> Docs:
> http://people.apache.org/~pwendell/spark-1.0.0-rc3-docs/
>
> Repository:
> https://repository.apache.org/content/repositories/orgapachespark-1012/
>
> == API Changes ==
> If you want to test building against Spark there are some minor API
> changes. We'll get these written up for the final release but I'm noting a
> few here (not comprehensive):
>
> changes to ML vector specification:
> http://people.apache.org/~pwendell/spark-1.0.0-rc3-docs/mllib-guide.html#from-09-to-10
>
> changes to the Java API:
> http://people.apache.org/~pwendell/spark-1.0.0-rc3-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
>
> coGroup and related functions now return Iterable[T] instead of Seq[T]
> ==> Call toSeq on the result to restore the old behavior
>
> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
> ==> Call toSeq on the result to restore old behavior
>
> Streaming classes have been renamed:
> NetworkReceiver -> Receiver
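
For what it's worth, the toSeq migrations above look roughly like this
in user code (a sketch, untested; `rdd1` and `rdd2` are assumed to be
pair RDDs):

      import org.apache.spark.SparkContext
      import org.apache.spark.SparkContext._  // pair RDD functions

      // cogroup values are now Iterable[T] instead of Seq[T]:
      val asBefore = rdd1.cogroup(rdd2).mapValues { case (vs, ws) =>
        (vs.toSeq, ws.toSeq)  // restore the old Seq-based behavior
      }

      // jarOfClass now returns Option[String] instead of Seq[String]:
      val jars = SparkContext.jarOfClass(getClass).toSeq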



-- 
Marcelo
