There is a way to have only Spark use Java 8 while Hadoop stays on Java 7:
[attachment: spark-conf.jpg, 58K]
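
The gist is to point just Spark at a JDK 8 install and leave Hadoop's
JAVA_HOME alone. A minimal sketch, assuming Spark on YARN (paths are
examples, not the exact values from the screenshot):

    # conf/spark-env.sh on the gateway/driver hosts
    export JAVA_HOME=/usr/java/jdk1.8.0_121

    # spark-defaults.conf, so the AM and executor containers also use JDK 8
    spark.yarn.appMasterEnv.JAVA_HOME  /usr/java/jdk1.8.0_121
    spark.executorEnv.JAVA_HOME        /usr/java/jdk1.8.0_121

Hadoop services keep whatever JAVA_HOME hadoop-env.sh sets, so the rest
of the cluster can stay on JDK 7.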



By the way, I have a way to install any Spark version on CM5.4 - CM5.7 using a
custom CSD <https://github.com/wangyum/cm_csds/tree/master/SPARK> and a
custom Spark parcel <https://github.com/wangyum/spark-parcel>.
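
Roughly, the install looks like this (the jar/parcel names below are just
placeholders; the real ones come out of those two repos):

    # On the Cloudera Manager server host: drop the CSD jar into the
    # local descriptor repository and restart CM so it picks it up
    sudo cp SPARK_CUSTOM-1.0.jar /opt/cloudera/csd/
    sudo service cloudera-scm-server restart

    # Publish the built parcel directory (the .parcel files plus
    # manifest.json) over HTTP, add that URL under
    # Hosts -> Parcels -> Configuration -> Remote Parcel Repository URLs,
    # then Download / Distribute / Activate it from the Parcels page.

After that the custom Spark service shows up in the Add a Service wizard.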

On Wed, Feb 15, 2017 at 6:46 AM, Koert Kuipers <ko...@tresata.com> wrote:

> what about the conversation about dropping scala 2.10?
>
> On Fri, Feb 10, 2017 at 11:47 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> As you have seen, there's a WIP PR to implement removal of Java 7
>> support: https://github.com/apache/spark/pull/16871
>>
>> I have heard several +1s at https://issues.apache.org/jira/browse/SPARK-19493
>> but am asking for concerns too, now that there's a concrete change to review.
>>
>> If this goes in for 2.2 it can be followed by more extensive update of
>> the Java code to take advantage of Java 8; this is more or less the
>> baseline change.
>>
>> We also just removed Hadoop 2.5 support. I know there was talk about
>> removing Python 2.6. I have no opinion on that myself, but, might be time
>> to revive that conversation too.
>>
>
>