+1
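
For 2(b) below, a rough sketch of the kind of check SparkContext could run at
startup; the object and method names here are hypothetical, not Spark's actual
code:

object DeprecationCheck {
  // Hypothetical helper: warn when running on Java 7 or a Scala 2.10 build.
  def warnIfDeprecated(log: String => Unit): Unit = {
    val javaVersion = System.getProperty("java.specification.version")
    val scalaVersion = scala.util.Properties.versionNumberString
    if (javaVersion == "1.7") {
      log("Support for Java 7 is deprecated as of Spark 2.1.0 and will be " +
        "removed in a future release; please upgrade to Java 8.")
    }
    if (scalaVersion.startsWith("2.10")) {
      log("Support for Scala 2.10 is deprecated as of Spark 2.1.0 and will " +
        "be removed in a future release; please move to Scala 2.11.")
    }
  }
}

SparkContext's constructor could then call something like
DeprecationCheck.warnIfDeprecated(msg => logWarning(msg)), using whatever
logging hook is already available there.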

On Thu, Oct 27, 2016 at 3:15 AM, Reynold Xin <r...@databricks.com> wrote:

> I created a JIRA ticket to track this:
> https://issues.apache.org/jira/browse/SPARK-18138
>
>
>
> On Thu, Oct 27, 2016 at 10:19 AM, Steve Loughran <ste...@hortonworks.com>
> wrote:
>
>>
>> On 27 Oct 2016, at 10:03, Sean Owen <so...@cloudera.com> wrote:
>>
>> Seems OK to me.
>> How about Hadoop < 2.6 and Python 2.6? Those seem more removable. I'd
>> like to add those to a list of things that will begin to be unsupported 6
>> months from now.
>>
>>
>> If you go to Java 8 only, then Hadoop 2.6+ is mandatory.
>>
>>
>> On Wed, Oct 26, 2016 at 8:49 PM Koert Kuipers <ko...@tresata.com> wrote:
>>
>>> that sounds good to me
>>>
>>> On Wed, Oct 26, 2016 at 2:26 PM, Reynold Xin <r...@databricks.com>
>>> wrote:
>>>
>>> We can do the following concrete proposal:
>>>
>>> 1. Plan to remove support for Java 7 / Scala 2.10 in Spark 2.2.0
>>> (Mar/Apr 2017).
>>>
>>> 2. In Spark 2.1.0 release, aggressively and explicitly announce the
>>> deprecation of Java 7 / Scala 2.10 support.
>>>
>>> (a) It should appear in the release notes and in documentation that
>>> mentions how to build Spark,
>>>
>>> (b) and a warning should be shown every time SparkContext is started
>>> using Scala 2.10 or Java 7.
>>>
>>>
>>
>
