Common practice at Apache is to provide backwards compatibility starting with 1.x releases.
--sebastian

On 28.10.2013 23:47, Jon Hartlaub wrote:
> Usually something like APR or SemVer is useful when thinking about library
> versioning and compatibility; it helps to standardize things a bit.
>
> http://apr.apache.org/versioning.html
> http://semver.org/
>
> On Mon, Oct 28, 2013 at 3:39 PM, Mingxi Wu <[email protected]> wrote:
>
>> That's a valid concern. If API backward compatibility is not maintained,
>> or breakage at least minimized, upgrading production code will be painful.
>>
>> On Mon, Oct 28, 2013 at 3:29 PM, Mark Hamstra <[email protected]> wrote:
>>
>>> While that is good, it really isn't good enough. Requiring updated source
>>> code for everything that uses Spark every time Spark goes from x.y.z to
>>> x.y.(z+1) is not going to win many friends among developers building on
>>> top of Spark. Quite the opposite.
>>>
>>> On Mon, Oct 28, 2013 at 3:25 PM, Reynold Xin <[email protected]> wrote:
>>>
>>>> Hi Mark,
>>>>
>>>> I can't comment much on the Spark part right now (because I have to run
>>>> in 3 mins), but we will make Shark 0.8.1 work with Spark 0.8.1 for sure.
>>>> Some of the changes will get cherry-picked into branch-0.8 of Shark.
>>>>
>>>> On Mon, Oct 28, 2013 at 6:22 PM, Mark Hamstra <[email protected]> wrote:
>>>>
>>>>> Or, more to the point: what is our commitment to backward compatibility
>>>>> in point releases?
>>>>>
>>>>> Many Java developers will come to a library or platform versioned as
>>>>> x.y.z with the expectation that if their own code worked well using
>>>>> x.y.(z-1) as a dependency, then moving up to x.y.z will be painless and
>>>>> trivial. That is not looking like it will be the case for Spark 0.8.0
>>>>> and 0.8.1.
>>>>>
>>>>> We only need to look at Shark as an example of code built with a
>>>>> dependency on Spark to see the problem. Shark 0.8.0 works with Spark
>>>>> 0.8.0. Shark 0.8.0 does not build with Spark 0.8.1-SNAPSHOT. Presumably
>>>>> that lack of backwards compatibility will continue into the eventual
>>>>> release of Spark 0.8.1, and that makes life hard on developers using
>>>>> Spark and Shark. For example, a developer using the released version of
>>>>> Shark but wanting to pick up the bug fixes in Spark no longer has a good
>>>>> option, since 0.8.1-SNAPSHOT (or the eventual 0.8.1 release) doesn't
>>>>> work, and moving to the wild and woolly development on the master
>>>>> branches of Spark and Shark is not a good idea for someone trying to
>>>>> develop production code. In other words, none of the bug fixes in Spark
>>>>> 0.8.1 are accessible to this developer until 0.8.1-compatible versions
>>>>> of Shark (and of anything else built on Spark that this developer is
>>>>> using) become available.
>>>>>
>>>>> The only other option is trying to cherry-pick commits from, e.g., Shark
>>>>> 0.9.0-SNAPSHOT into Shark 0.8.0 until Shark 0.8.0 has been brought up to
>>>>> a point where it works with Spark 0.8.1. But an application developer
>>>>> shouldn't need to do that just to get the bug fixes in Spark 0.8.1, and
>>>>> it is not immediately obvious just which Shark commits are necessary and
>>>>> sufficient to produce a correct, Spark-0.8.1-compatible version of Shark
>>>>> (indeed, there is no guarantee that such a thing is even possible).
>>>>> Right now, I believe that 67626ae3eb6a23efc504edf5aedc417197f072cf,
>>>>> 488930f5187264d094810f06f33b5b5a2fde230a and
>>>>> bae19222b3b221946ff870e0cee4dba0371dea04 are necessary to get Shark to
>>>>> work with Spark 0.8.1-SNAPSHOT, but that those commits are not
>>>>> sufficient (Shark builds against Spark 0.8.1-SNAPSHOT with those
>>>>> cherry-picks, but I'm still seeing runtime errors).
>>>>>
>>>>> In short, this is not a good situation, and we probably need a real 0.8
>>>>> maintenance branch that maintains backward compatibility with 0.8.0,
>>>>> because (at least to me) the current branch-0.8 of Spark looks more like
>>>>> another active development branch (in addition to the master and
>>>>> scala-2.10 branches) than it does a maintenance branch.
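For concreteness, the SemVer contract linked above can be sketched roughly as
follows, as it bears on the 0.8.0 -> 0.8.1 question. This is only an
illustrative sketch: the SemVer case class and promisesCompatibility helper
are hypothetical names made up for the example and are not part of Spark,
Shark, or any existing library.

    // Illustrative sketch of the SemVer rules (http://semver.org/); the names
    // here are assumptions for the example, not anything in Spark or Shark.
    case class SemVer(major: Int, minor: Int, patch: Int)

    object SemVer {
      // Parse an "x.y.z" string; returns None if it does not match the format.
      def parse(s: String): Option[SemVer] = s.split("\\.") match {
        case Array(ma, mi, pa) =>
          try Some(SemVer(ma.toInt, mi.toInt, pa.toInt))
          catch { case _: NumberFormatException => None }
        case _ => None
      }

      // Under SemVer, a dependent built against `from` should keep working
      // against `to` as long as the major version is unchanged and the
      // version only moves forward -- but only once major >= 1; before 1.0.0
      // the spec says anything may change at any time.
      def promisesCompatibility(from: SemVer, to: SemVer): Boolean =
        from.major >= 1 &&
          from.major == to.major &&
          (to.minor > from.minor ||
            (to.minor == from.minor && to.patch >= from.patch))
    }

    object VersionCheck extends App {
      val v080 = SemVer.parse("0.8.0").get
      val v081 = SemVer.parse("0.8.1").get
      // Prints false: strictly read, 0.x releases promise nothing, which is
      // why 0.8.0 -> 0.8.1 breakage is "allowed" but still painful in practice.
      println(SemVer.promisesCompatibility(v080, v081))

      val v110 = SemVer.parse("1.1.0").get
      val v111 = SemVer.parse("1.1.1").get
      // Prints true: once at 1.x, a patch bump must not break dependents.
      println(SemVer.promisesCompatibility(v110, v111))
    }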
