I don't recall there being a consensus to deprecate all but 2 versions of
Spark. I think the confusion may be because that's how many Flink versions
are supported in that community. For Spark, I think we will need to support
older versions until most people are able to move off of them, which can
take a long time. But as versions age, we should definitely try to spend
less time maintaining them! Hopefully our new structure helps us get to
that point.

On Tue, Oct 26, 2021 at 12:08 PM Wing Yew Poon <wyp...@cloudera.com.invalid>
wrote:

> Thanks, Sam. Was there also agreement to deprecate Spark 3.0 support and go
> with supporting the latest 2 versions of Spark 3?
>
>
> On Tue, Oct 26, 2021 at 11:36 AM Sam Redai <s...@tabular.io> wrote:
>
>> If I remember correctly, we landed on option 1, creating a v3.1 without
>> the extra reflection logic and then just deprecating 3.0 when the time
>> comes. If everyone agrees with that, I can amend the notes to describe it
>> more explicitly.
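>>
>> For context, the "extra reflection logic" is the version-bridging pattern
>> that lets the 3.0-compiled module call 3.1-only APIs at runtime. It looks
>> roughly like this sketch (the method and parameter names here are
>> illustrative guesses, not the actual code):
>>
>>   import java.lang.reflect.Method;
>>
>>   // Sketch: compiled against Spark 3.0, invoking a hypothetical
>>   // 3.1-only method reflectively instead of calling it directly.
>>   static Object requiredDistribution(Object writeBuilder) throws Exception {
>>     Method m = writeBuilder.getClass().getMethod("requiredDistribution");
>>     return m.invoke(writeBuilder);
>>   }
>>
>> A dedicated 3.1 module can compile against 3.1 and make the direct call,
>> so all of that goes away.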
>>
>> -Sam
>>
>> On Mon, Oct 25, 2021 at 11:30 AM Wing Yew Poon
>> <wyp...@cloudera.com.invalid> wrote:
>>
>>> Adding v3.2 to Spark Build Refactoring
>>>>
>>>>    - Russell and Anton will coordinate on dropping in a Spark 3.2 module
>>>>    - We currently have 3.1 in the `spark3` module. We’ll move that out
>>>>      to its own module and mirror what we do with the 3.2 module. (This
>>>>      will enable cleaning up some mixed 3.0/3.1 code)
>>>>
>>> Hi,
>>> I'm sorry I missed the last sync and only have these meeting minutes to
>>> go by.
>>> A Spark 3.2 module has now been added. Is the plan still to add a Spark
>>> 3.1 module? Will we have v3.0, v3.1, and v3.2 subdirectories under spark/?
>>> I think when we first started discussing how to organize the code for
>>> Spark 3 support, the proposal was to support two versions.
>>> IMO, for maintainability, we should only support two versions of Spark
>>> 3. However, in this transition period, I can see two approaches:
>>> 1. Create a v3.1 subdirectory, remove the reflection workarounds from its
>>> code, add explicit 3.1-specific modules, and build and test against 3.1
>>> (see the layout sketch after these two options). We then have three Spark
>>> 3 versions. At the next release, deprecate Spark 3.0 support and remove
>>> the v3.0 directory and its modules.
>>> 2. Support Spark 3.1 and 3.0 from the common 3.0-based code. At the next
>>> release, deprecate Spark 3.0 support, rename v3.0 to v3.1, and update its
>>> code to remove the reflection workarounds.
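>>> To make that concrete, under option 1 the tree would look roughly like
>>> this (the layout is my reading of the refactoring plan, and the
>>> directory names are assumptions, not settled):
>>>
>>>   spark/
>>>     v3.0/   existing modules, reflection workarounds intact until removed
>>>     v3.1/   new 3.1-specific modules, built and tested against Spark 3.1
>>>     v3.2/   the module Russell and Anton are adding
>>>
>>> Under option 2, only v3.0 and v3.2 would exist until the rename at the
>>> next release.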
>>> As I said, I missed the meeting. Perhaps option 1 is the plan that was
>>> decided?
>>> (If it is, I'm willing to take on the work. I just need to know the plan.)
>>> Thanks,
>>> Wing Yew

-- 
Ryan Blue
Tabular
