Excellent, thanks for the quick fix, Deron.

Regards,
Matthias

On 2/21/2017 1:09 AM, Deron Eriksson wrote:
Note that MLContext has been updated to log a warning, rather than throw an
exception to the user, for Spark versions prior to 2.1.0.
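
For illustration, a minimal sketch of what such a soft check might look
like. The class and method names here are hypothetical, not the actual
SystemML code:

    import org.apache.log4j.Logger;

    // Hypothetical sketch: warn on older Spark versions instead of
    // failing hard, so 2.0.x clusters can still run.
    public class SparkVersionCheck {
        private static final Logger LOG =
            Logger.getLogger(SparkVersionCheck.class);
        private static final String RECOMMENDED_SPARK_VERSION = "2.1.0";

        public static void verifySparkVersion(String sparkVersion) {
            if (compareVersions(sparkVersion, RECOMMENDED_SPARK_VERSION) < 0) {
                LOG.warn("Spark " + sparkVersion
                    + " is older than the recommended "
                    + RECOMMENDED_SPARK_VERSION
                    + "; some features may not be available.");
            }
        }

        // Compare dotted version strings numerically, so "2.0.2" < "2.1.0".
        private static int compareVersions(String v1, String v2) {
            String[] a = v1.split("\\.");
            String[] b = v2.split("\\.");
            for (int i = 0; i < Math.max(a.length, b.length); i++) {
                int n1 = i < a.length ? parseLeadingDigits(a[i]) : 0;
                int n2 = i < b.length ? parseLeadingDigits(b[i]) : 0;
                if (n1 != n2)
                    return Integer.compare(n1, n2);
            }
            return 0;
        }

        // Extract the leading digits of a component ("0-SNAPSHOT" -> 0).
        private static int parseLeadingDigits(String s) {
            String digits = s.replaceAll("[^0-9].*", "");
            return digits.isEmpty() ? 0 : Integer.parseInt(digits);
        }
    }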

Deron

On Mon, Feb 20, 2017 at 2:29 PM, Matthias Boehm1 <matthias.boe...@ibm.com>
wrote:

That's a good catch, Felix! I would recommend demoting this exception to a
warning and moving the check to a central place like SparkExecutionContext to
ensure consistency across all APIs and deployments.
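
As a rough illustration of that central placement (reusing the hypothetical
SparkVersionCheck sketch above; names are again illustrative only), the check
could run once from a shared execution-context class so that every API path
sees the same behavior:

    import org.apache.spark.SparkContext;

    // Illustrative only: a single call site in a shared execution-context
    // class, so MLContext and the other APIs need no per-API version logic.
    public class SparkExecutionContextSketch {
        // Called once when the SparkContext is obtained, regardless of API.
        public static void onSparkContextCreated(SparkContext sc) {
            // sc.version() reports the runtime Spark version, e.g. "2.0.2"
            SparkVersionCheck.verifySparkVersion(sc.version());
        }
    }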

Regards,
Matthias



From: Deron Eriksson <deroneriks...@gmail.com>
To: dev@systemml.incubator.apache.org
Date: 02/20/2017 02:14 PM
Subject: Re: Minimum required Spark version
------------------------------



Hi Felix,

I agree that the 2.1 hard requirement is a bit restrictive. If someone can
validate that Spark versions below 2.1 but at or above 2.0 (the 2.0.x line)
work, this seems like a great idea to me.

Deron


On Mon, Feb 20, 2017 at 1:43 PM, <fschue...@posteo.de> wrote:

Hi,

The current master and the 0.13 release have a hard requirement in MLContext
for Spark 2.1. Is this really necessary, or could we set it to >= 2.0?
Supporting only the latest Spark release seems a little restrictive to me.


-Felix




--
Deron Eriksson
Spark Technology Center
http://www.spark.tc/





