I would vote to support at least the previous Spark release. The big Hadoop distros are usually a version behind in their Spark support. For example, we use MapR, whose latest release (4.1.0) only supports Spark 1.2.1 and 1.3.1 <http://doc.mapr.com/display/MapR/Ecosystem+Support+Matrix>.
--
Eric

On Tue, Jul 7, 2015 at 12:55 PM, Anthony Baker <aba...@pivotal.io> wrote:
> Given the rate of change, it doesn’t seem like we should be trying to add
> (and maintain) support for every single Spark release. We’re early in the
> lifecycle of the Spark connector, and too much emphasis on backwards
> compatibility will be a drag on our ongoing development, particularly
> since the Spark community values rapid evolution over stability.
>
> (apologies if I have misconstrued the state of Spark)
>
> Anthony
>
> > On Jul 6, 2015, at 11:22 PM, Qihong Chen <qc...@pivotal.io> wrote:
> >
> > The problem is caused by multiple major dependencies with different
> > release cycles. The Spark Geode Connector depends on two products,
> > Spark and Geode (not counting other dependencies); Spark moves much
> > faster than Geode, and some features/code are not backward compatible.
> >
> > Our initial connector implementation depended on Spark 1.2, up until
> > the last week of March 2015. Then Spark 1.3 was released in the last
> > week of March, and some connector features didn't work with Spark 1.3,
> > so we moved on and now support Spark 1.3 (but not 1.2 anymore, though
> > we did create a tag). Two weeks ago, Spark 1.4 was released, and it
> > breaks our connector code again.
> >
> > Therefore, for each Geode release, we probably need multiple Connector
> > releases, and probably need to maintain the last 2 or 3 Connector
> > releases. For example, we need to support both Spark 1.3 and 1.4 with
> > the current Geode code.
> >
> > The question is: how do we support this with a single source
> > repository?
> >
> > Thanks,
> > Qihong
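For what it's worth, one common way to answer the single-repository question is to parameterize the Spark version in the build, so the same source tree compiles against several Spark releases and only the code that actually broke between releases is duplicated in per-version source directories. A minimal sbt sketch of the idea; the `spark.version` property, the directory layout, and the module name here are illustrative assumptions, not the connector's actual build:

```scala
// build.sbt -- illustrative sketch only. Select the Spark version at
// build time, e.g.:  sbt -Dspark.version=1.4.0 package
val sparkVersion = sys.props.getOrElse("spark.version", "1.3.1")

// Collapse "1.3.1" -> "1.3" so shim sources can be grouped per minor release.
val sparkBinary = sparkVersion.split('.').take(2).mkString(".")

lazy val connector = (project in file("."))
  .settings(
    name := "geode-spark-connector",
    scalaVersion := "2.10.5",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
    ),
    // Version-specific shims live in src/main/scala-spark-1.3,
    // src/main/scala-spark-1.4, etc.; everything else is shared.
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / s"src/main/scala-spark-$sparkBinary"
  )
```

Running the build once per `-Dspark.version` then produces one artifact per supported Spark line from the same repo. The same effect can be had with Maven profiles if the project builds with Maven instead of sbt.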