Minimum required Spark version

2017-02-20 Thread fschueler

Hi,

The current master and the 0.13 release have a hard requirement in MLContext
for Spark 2.1. Is this really necessary, or could we set it to >= 2.0?
Only supporting the latest Spark release seems a little restrictive to me.



-Felix
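
For context, the hard requirement amounts to a version gate of roughly the
following shape. This is a minimal illustrative sketch with assumed names
(SparkVersionGate, MIN_SUPPORTED_SPARK, verifySparkVersion), not the actual
MLContext code:

    import org.apache.spark.SparkContext;

    // Hard check that rejects any Spark release below a fixed minimum version.
    // All names here are illustrative assumptions, not the SystemML implementation.
    public class SparkVersionGate {

        private static final String MIN_SUPPORTED_SPARK = "2.1.0";

        /** Fails hard if the running Spark version is older than the required minimum. */
        public static void verifySparkVersion(SparkContext sc) {
            String version = sc.version();
            if (compareVersions(version, MIN_SUPPORTED_SPARK) < 0) {
                throw new RuntimeException("Spark " + MIN_SUPPORTED_SPARK
                    + " or later is required, but found Spark " + version);
            }
        }

        /** Compares dotted version strings numerically; assumes plain numeric parts such as "2.0.2". */
        static int compareVersions(String a, String b) {
            String[] pa = a.split("\\.");
            String[] pb = b.split("\\.");
            int n = Math.max(pa.length, pb.length);
            for (int i = 0; i < n; i++) {
                int x = (i < pa.length) ? Integer.parseInt(pa[i]) : 0;
                int y = (i < pb.length) ? Integer.parseInt(pb[i]) : 0;
                if (x != y)
                    return Integer.compare(x, y);
            }
            return 0;
        }
    }

Relaxing the requirement to >= 2.0 would then be a one-line change to the
minimum-version constant in such a check.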


Re: Minimum required Spark version

2017-02-20 Thread Deron Eriksson
Hi Felix,

I agree that the 2.1 hard requirement is a bit restrictive. If someone can
validate that Spark versions below 2.1 but at least 2.0 (i.e., the 2.0.x
releases) work, this seems like a great idea to me.

Deron


-- 
Deron Eriksson
Spark Technology Center
http://www.spark.tc/


Re: Minimum required Spark version

2017-02-20 Thread Matthias Boehm

That's a good catch, Felix! I would recommend demoting this exception to a
warning and moving the check to a central place like SparkExecutionContext to
ensure consistency across all APIs and deployments.

Regards,
Matthias
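
Concretely, that suggestion amounts to something like the following minimal
sketch, assuming a single central helper (the class and method names below are
placeholders for wherever the Spark handling is centralized, e.g.
SparkExecutionContext; this is not the actual SystemML code):

    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    import org.apache.spark.SparkContext;

    // Central, non-fatal version check: log a warning instead of failing hard.
    // Names are assumptions for illustration only.
    public class CentralSparkVersionCheck {

        private static final Log LOG = LogFactory.getLog(CentralSparkVersionCheck.class);
        private static final String RECOMMENDED_SPARK = "2.1.0";

        /** Warns, rather than throws, if the running Spark version is older than recommended. */
        public static void warnIfOldSparkVersion(SparkContext sc) {
            String version = sc.version();
            // Lexicographic comparison suffices here because all version components are single digits.
            if (version.compareTo(RECOMMENDED_SPARK) < 0) {
                LOG.warn("Detected Spark " + version + ", but Spark " + RECOMMENDED_SPARK
                    + " or later is recommended; proceeding anyway.");
            }
        }
    }

Every API entry point (MLContext, command line, etc.) could then call this one
helper, which keeps the behavior consistent across deployments.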




Re: Minimum required Spark version

2017-02-21 Thread Deron Eriksson
Note that MLContext has been updated to log a warning, rather than throw an
exception, for Spark versions prior to 2.1.0.

Deron



-- 
Deron Eriksson
Spark Technology Center
http://www.spark.tc/


Re: Minimum required Spark version

2017-02-21 Thread Matthias Boehm

Excellent, thanks for the quick fix, Deron.

Regards,
Matthias


Re: Minimum required Spark version

2017-02-21 Thread fschueler

Thanks Deron :)
