Another consideration to keep in mind is the other backends - whether 
used or not - such as MR, GPU, library, and possibly Flink, ...

Regards,
Berthold Reinwald
IBM Almaden Research Center
office: (408) 927 2208; T/L: 457 2208
e-mail: reinw...@us.ibm.com



From:   Matthias Boehm/Almaden/IBM@IBMUS
To:     dev@systemml.incubator.apache.org
Date:   08/25/2016 10:59 AM
Subject:        Re: [DISCUSS] Apache SystemML Release 1.0.0



I'm still not fully convinced that we need to drop Spark 1.x support 
instead of supporting both 1.x and 2.x. I would appreciate it if we could 
first conclude the discussion around migrating to Spark 2.0. 

Furthermore, I think that creating a dependency on Spark versioning would 
unnecessarily complicate our own release process. I would rather use major 
releases as an opportunity to clean up APIs and drop certain language 
features, and that is unlikely to coincide with Spark's releases. From my 
perspective, it would be even more confusing for a user if we released a 
major version for a relatively minor change such as support for a new 
Spark version. 
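
For illustration only, here is a minimal sketch of what supporting both 
lines in a single build might look like, branching on the Spark version 
at runtime. The helper class and method names below are hypothetical, 
not actual SystemML code; only SparkContext.version() is real Spark API:

    import org.apache.spark.SparkContext;

    // Hypothetical helper: choose between Spark 1.x and 2.x code paths
    // at runtime, so one SystemML build could serve both release lines.
    public class SparkVersionSupport {
        public static boolean isSpark2(SparkContext sc) {
            // SparkContext.version() returns e.g. "1.6.2" or "2.0.0"
            return sc.version().startsWith("2.");
        }
    }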

Regards,
Matthias



From: Luciano Resende <luckbr1...@gmail.com>
To: dev@systemml.incubator.apache.org
Date: 08/25/2016 07:15 PM
Subject: Re: [DISCUSS] Apache SystemML Release 1.0.0



On Wed, Aug 24, 2016 at 6:11 PM, <dusenberr...@gmail.com> wrote:

> Yes I'm also in favor of moving to a 1.0 version for our upcoming
> release targeting the Spark 1.x series. Since we'll also be subsequently
> releasing a version targeting the Spark 2.x series, I would also like to
> suggest that we name that version 2.0. This version naming scheme would
> allow us to easily associate a SystemML version with the Spark series
> that it targets, thus reducing confusion for a user. Rather than view a
> 2.0 version as a successor to 1.0, let's view it instead as simply a
> naming scheme that corresponds to the targeted version of Spark.
>
> So, 1.0 would be our upcoming release targeting Spark 1.x, and 2.0 would
> be our upcoming release targeting Spark 2.x.
>

+ 1



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/




