Just to be clear, instead of creating a branch to merge the 2.0 support, we 
will want to merge the 2.0 support into the master branch.
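A minimal, self-contained sketch of that flow (the repository layout and branch names here are illustrative, not the actual SystemML setup):

```shell
# Illustrative demo of the plan: keep a branch for the Spark 1.6 (0.12) line,
# and merge the Spark 2.x changes into master. Runs in a throwaway temp repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git symbolic-ref HEAD refs/heads/master          # ensure the default branch is 'master'
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "0.12 state (Spark 1.6)"
git branch branch-0.12                           # Spark 1.6 support stays reachable here
git checkout -q -b spark-2.x                     # stand-in for the Spark 2.x PR branch
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "Spark 2.x support"
git checkout -q master
git merge -q spark-2.x                           # fast-forwards master onto the 2.x work
git log --oneline -n 1                           # newest commit on master is the 2.x one
```

The point is simply that merging the PR into master does not lose the 1.6 line, since `branch-0.12` still points at it.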

--

Mike Dusenberry
GitHub: github.com/dusenberrymw
LinkedIn: linkedin.com/in/mikedusenberry

Sent from my iPhone.


> On Jan 9, 2017, at 12:02 PM, Acs S <ac...@yahoo.com.INVALID> wrote:
> 
> Based on the discussion thread, we will start creating a SystemML release based
> on Spark 2.0.
> There are a bunch of activities that need to be completed, and we need
> volunteers for most of them. Volunteers so far are listed in parentheses:
> 1. Create a branch based on the SystemML 0.12 release to merge the Spark 2.0 code (Luciano)
> 2. Get the Spark 2.0 PR merged to this new branch (Glenn)
> 3. Make build changes to have both Spark 1.6 and 2.0 builds for release and PR (someone needs to work with Alan)
> 4. Set up a Spark 2.0 cluster (one of the Almaden clusters updated with Spark 2.0)
> 5. Create a release candidate (Glenn, Deron, Arvind)
> 6. Performance testing
> 7. Notebook testing (Arvind)
> 8. Python DSL verification (2.x and 3.x)
> 9. Scala DSL verification
> 10. Artifacts verification
> 11. Documentation update
> 
> -----------------
> Arvind Surve
> Spark Technology Center
> http://www.spark.tc
> 
>      From: Niketan Pansare <npan...@us.ibm.com>
> To: dev@systemml.incubator.apache.org 
> Sent: Friday, January 6, 2017 1:12 PM
> Subject: Re: Time To Merge Spark 2.0 Support PR
> 
> I am fine with creating a branch for Spark 1.6 support and merging the Spark 2.0 
> PR then. As Luciano said, we can create a 0.12 release from our Spark 1.6 
> branch. 
> 
> Overriding the previous release is common practice for the pip installer; 
> however, PyPI does maintain the history of releases. Once a 0.12 release 
> candidate is created, users can install the SystemML Python package in three ways:
> 1. From source, by checking out the branch and executing `mvn package -P 
> distribution`, followed by `pip install 
> target/systemml-0.12.0-incubating-python.tgz`
> 2. From the Apache site: `pip install 
> http://www.apache.org/dyn/closer.lua/incubator/systemml/0.12.0-incubating/systemml-0.12.0-incubating-python.tgz`
> 3. From PyPI, by specifying the version: `pip install -I systemml_incubating==0.12`
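For the pinning question raised earlier in the thread, here is a tiny offline sketch (the version numbers are made up) of why the `==` in option 3 matters: an unpinned `pip install` resolves to the newest release on PyPI, so a later Spark 2.x upload would be picked up automatically, while a pinned install keeps targeting 0.12.

```shell
# Illustrative only: mimic pip's "newest wins" resolution with GNU sort -V.
releases="0.11.0
0.12.0
1.0.0"                                          # hypothetical release history on PyPI
latest=$(printf '%s\n' "$releases" | sort -V | tail -n 1)
echo "unpinned install resolves to: $latest"    # a pin like ==0.12.0 would override this
```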
> 
> As long as we ensure that the version of the Python package on PyPI matches our 
> release version, and we document the Spark support in our release notes, there 
> should not be any confusion about usage :)
> 
> Thanks,
> 
> Niketan Pansare
> IBM Almaden Research Center
> E-mail: npansar At us.ibm.com
> http://researcher.watson.ibm.com/researcher/view.php?person=us-npansar
> 
> 
> From: Acs S <ac...@yahoo.com.INVALID>
> To: "dev@systemml.incubator.apache.org" <dev@systemml.incubator.apache.org>
> Date: 01/06/2017 12:57 PM
> Subject: Re: Time To Merge Spark 2.0 Support PR
> 
> 
> 
> I would agree to create a branch and add Spark 2.0 to it, while still 
> releasing SystemML 0.12 with the pip install artifact.
> Regarding the comment from Mike that a new SystemML release will update the PyPI 
> package: shouldn't it be tagged with a version number? Otherwise every release 
> will override the previous one. Niketan, any comments?
> -Arvind
> 
>      From: Matthias Boehm <mboe...@googlemail.com>
> To: dev@systemml.incubator.apache.org 
> Sent: Friday, January 6, 2017 12:52 PM
> Subject: Re: Time To Merge Spark 2.0 Support PR
>   
> +1 on moving to Spark 2.x - I think we've delayed this way too long now, and 
> there will always be some awesome feature that we'd want to support on 
> older Spark versions too.
> 
> Regards,
> Matthias
> 
>> On 1/6/2017 9:41 PM, Mike Dusenberry wrote:
>> Well, to be fair, a user can still use the Python DSL with the SystemML 0.11
>> release by using `pip install -e src/main/python`.  We just didn't place a
>> separate Python binary on the release website.  Keep in mind as well that
>> once we release the next release with Spark 2.x support, a Spark 1.6 user will
>> not be able to use `pip install systemml` anyway, as that PyPI package will
>> have been updated to the latest Spark 2.x release.
>> 
>> I'm concerned that we are moving too slowly as a project without a clear
>> set of users holding us back on 1.6.  Thoughts?
>> 
>> 
>> --
>> 
>> Michael W. Dusenberry
>> GitHub: github.com/dusenberrymw
>> LinkedIn: linkedin.com/in/mikedusenberry
>> 
>>> On Fri, Jan 6, 2017 at 12:33 PM, Acs S <ac...@yahoo.com.invalid> wrote:
>>> 
>>> With the SystemML 0.11 release we don't have Python DSL support on Spark
>>> 1.6. To have Python DSL support on Spark 1.6, we have to release SystemML
>>> with the pip install artifact, which was planned for SystemML 0.12.
>>> I understand Spark 2.0 has been released; at the same time, there are Spark 1.6
>>> users who will not be able to use SystemML with the Python DSL if we simply move
>>> to Spark 2.0 without this artifact as planned. I understand SystemML 0.12 has
>>> been delayed due to the holiday period, but let's not panic and rush this.
>>> -Arvind
>>> 
>>>       From: Deron Eriksson <deroneriks...@gmail.com>
>>>   To: dev@systemml.incubator.apache.org
>>> Cc: Acs S <ac...@yahoo.com>
>>>   Sent: Friday, January 6, 2017 12:21 PM
>>>   Subject: Re: Time To Merge Spark 2.0 Support PR
>>> 
>>> I would prefer to move forward with Spark 2 now.
>>> 
>>> Deron
>>> 
>>> 
>>> 
>>> On Fri, Jan 6, 2017 at 12:15 PM, Mike Dusenberry <dusenberr...@gmail.com>
>>> wrote:
>>> 
>>>> I vote that we scratch that release, and move forward with adding Spark 2.x
>>>> support now, and then release on February 1st as discussed on another
>>>> thread this week.  Otherwise, our PR board will continue to be backed up
>>>> without real benefit to the user.
>>>> 
>>>> - Mike
>>>> 
>>>> 
>>>> --
>>>> 
>>>> Michael W. Dusenberry
>>>> GitHub: github.com/dusenberrymw
>>>> LinkedIn: linkedin.com/in/mikedusenberry
>>>> 
>>>>> On Fri, Jan 6, 2017 at 12:03 PM, Acs S <ac...@yahoo.com.invalid> wrote:
>>>>> 
>>>>> We are trying to get the SystemML 0.12 release out, which slowed down due to
>>>>> the holiday period over the last 2+ weeks. Final verification is pending from
>>>>> Glenn (Niketan, Berthed) related to running the wrapper code with 80GB of data.
>>>>> Once this gets verified, we should be in a position to release 0.12 and then
>>>>> plan to get the PR related to Spark 2.0 merged.
>>>>> -Arvind
>>>>> 
>>>>>       From: Mike Dusenberry <dusenberr...@gmail.com>
>>>>>   To: dev <dev@systemml.incubator.apache.org>
>>>>>   Sent: Friday, January 6, 2017 11:54 AM
>>>>>   Subject: Time To Merge Spark 2.0 Support PR
>>>>> 
>>>>> Hi to the SystemML community!
>>>>> 
>>>>> As you may know, SystemML currently only supports Spark 1.6, and does not
>>>>> yet support Spark 2.x.  However, there is an open PR
>>>>> (https://github.com/apache/incubator-systemml/pull/202) that replaces Spark
>>>>> 1.6 support with Spark 2.x support.
>>>>> 
>>>>> Spark 2.0.0 was released on July 26, 2016
>>>>> (https://spark.apache.org/news/spark-2-0-0-released.html), and 2.1.0 is now
>>>>> available (https://spark.apache.org/news/spark-2-1-0-released.html).
>>>>> 
>>>>> I think it is time to merge the above PR into the master branch of the
>>>>> project in order to move SystemML onto Spark 2.x.
>>>>> 
>>>>> Thoughts?  If no objections, I'd like to merge next week.
>>>>> 
>>>>> 
>>>>> Cheers!
>>>>> 
>>>>> - Mike
>>>>> 
>>>>> --
>>>>> 
>>>>> Michael W. Dusenberry
>>>>> GitHub: github.com/dusenberrymw
>>>>> LinkedIn: linkedin.com/in/mikedusenberry
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>> 
>>> 
>>> 
>>> 
>>> --
>>> Deron Eriksson
>>> Spark Technology Center
>>> http://www.spark.tc/
>>> 
>>> 
>>> 
>>> 
>> 
> 
