Re: Announcing Spark 1.1.1!

2014-12-03 Thread rzykov
Andrew and developers, thank you for the excellent release! It fixed almost all of our issues. We are now migrating to Spark from a zoo of Python, Java, Hive, and Pig jobs. Our Scala/Spark jobs often failed on 1.1; Spark 1.1.1 works like a Swiss watch.

Re: Announcing Spark 1.1.1!

2014-12-03 Thread Romi Kuntsman
About version compatibility and the upgrade path: can the Java application dependencies and the Spark server be upgraded separately (i.e., will a 1.1.0 library work with a 1.1.1 server, and vice versa), or do they need to be upgraded together? Thanks! *Romi Kuntsman*, *Big Data Engineer*

Re: Announcing Spark 1.1.1!

2014-12-03 Thread Andrew Or
By "the Spark server" do you mean the standalone Master? It is best if they are upgraded together, because there have been changes to the Master in 1.1.1. Although it might just work, it is highly recommended that you restart your cluster manager too.

Re: Announcing Spark 1.1.1!

2014-12-03 Thread Aaron Davidson
Because this was a maintenance release, we should not have introduced any binary backward or forward incompatibilities. Therefore, applications written and compiled against 1.1.0 should still work against a 1.1.1 cluster, and vice versa.
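As a sketch of what the application-side upgrade looks like (assuming an sbt build; the coordinates are the standard `org.apache.spark` artifacts on Maven Central), bumping the patch version in the build definition is all that is needed once the cluster is on 1.1.1:

```scala
// build.sbt — minimal sketch of pinning the Spark dependency.
// "provided" scope assumes the cluster supplies the Spark jars at
// runtime, so the application jar does not bundle its own copy and
// the library/server versions cannot silently diverge.
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" % "provided"
```

Per Aaron's note above, a jar built this way against 1.1.0 should also keep working on a 1.1.1 cluster without recompilation, since maintenance releases preserve binary compatibility.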

Announcing Spark 1.1.1!

2014-12-02 Thread Andrew Or
I am happy to announce the availability of Spark 1.1.1! This is a maintenance release with many bug fixes, most of which are concentrated in the core, including various fixes for sort-based shuffle, memory leaks, and spilling issues. Contributions to this release came from 55 developers.