About version compatibility and the upgrade path: can the Java application
dependencies and the Spark server be upgraded separately (i.e. will a 1.1.0
library work with a 1.1.1 server, and vice versa), or do they need to be
upgraded together?
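
(For reference, the "library" here means the Spark artifact the application
builds against; a minimal sketch, assuming an sbt build -- the Maven
coordinates are the same:)

    // build.sbt -- the application pins the Spark client library here;
    // "%%" appends the Scala binary version (e.g. spark-core_2.10)
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

So the question is whether that version string must be bumped to 1.1.1 in
lock-step with the cluster.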

Thanks!

*Romi Kuntsman*, *Big Data Engineer*
 http://www.totango.com

On Tue, Dec 2, 2014 at 11:36 PM, Andrew Or <and...@databricks.com> wrote:

> I am happy to announce the availability of Spark 1.1.1! This is a
> maintenance release with many bug fixes, most of which are concentrated in
> the core. These include various fixes to sort-based shuffle, memory
> leaks, and spilling issues. Contributions to this release came from 55
> developers.
>
> Visit the release notes [1] to read about the new features, or
> download [2] the release today.
>
> [1] http://spark.apache.org/releases/spark-release-1-1-1.html
> [2] http://spark.apache.org/downloads.html
>
> Please e-mail me directly about any typos in the release notes or the name
> listing.
>
> Thanks to everyone who contributed, and congratulations!
> -Andrew
>