Chiming in late, but my take on this line of argument: these
companies are welcome to keep using Spark 1.x. If anything, the
argument here is about how long to maintain 1.x, and indeed, it's
going to go dormant quite soon.

But using RHEL 6 (or any older version of any platform) and not
wanting to update already means you prefer stability over change.
I don't see a reasonable expectation that major releases of one
project must support older major releases of other platforms.

Conversely: supporting something in Spark 2.x means making sure
nothing breaks compatibility with it for a couple of years. That is
effort that could be spent elsewhere, and it has to be weighed.

(For similar reasons I personally don't favor supporting Java 7 or
Scala 2.10 in Spark 2.x.)
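
To make the tradeoff concrete: dropping 2.6 largely means we stop
testing against it and can fail fast instead. Something along these
lines (purely an illustrative sketch, not Spark's actual startup
code) is what a fail-fast guard on the PySpark driver would look like:

    import sys

    # Hypothetical guard: refuse to start on an unsupported interpreter,
    # rather than failing later with a confusing SyntaxError deep in a job.
    if sys.version_info < (2, 7):
        raise RuntimeError(
            "Spark 2.x requires Python 2.7 or later; found %d.%d"
            % sys.version_info[:2])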

On Tue, Jan 5, 2016 at 7:07 PM, Koert Kuipers <ko...@tresata.com> wrote:
> rhel/centos 6 ships with python 2.6, doesn't it?
>
> if so, i still know plenty of large companies where python 2.6 is the only
> option. asking them for python 2.7 is not going to work
>
> so i think it's a bad idea
>
> On Tue, Jan 5, 2016 at 1:52 PM, Juliet Hougland <juliet.hougl...@gmail.com>
> wrote:
>>
>> I don't see a reason Spark 2.0 would need to support Python 2.6. At this
>> point, Python 3 should be the default that is encouraged.
>> Most organizations acknowledge that 2.7 is common, but lagging behind the
>> version they should theoretically use. Dropping Python 2.6
>> support sounds very reasonable to me.
