>
> Adding v3.2 to Spark Build Refactoring
>
>    - Russell and Anton will coordinate on dropping in a Spark 3.2 module
>    - We currently have 3.1 in the `spark3` module. We’ll move that out to
>    its own module and mirror what we do with the 3.2 module. (This will
>    enable cleaning up some mixed 3.0/3.1 code.)
>
Hi,

I'm sorry I missed the last sync and only have these meeting minutes to go
by.

A Spark 3.2 module has now been added. Is the plan still to add a Spark 3.1
module? Will we have v3.0, v3.1, and v3.2 subdirectories under spark/?
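
For concreteness, this is the layout I'm imagining (the directory names are
my guess from the minutes, not something I've confirmed in the repo):

    spark/
      v3.0/   <- the current spark3 code, minus the 3.1-specific parts
      v3.1/   <- a new module built and tested against Spark 3.1
      v3.2/   <- the module that was just added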

I think when we first started discussing Spark 3 support and how to
organize the code, the proposal was to support only two versions? IMO, for
maintainability, we should support only two versions of Spark 3. However,
in this transition period, I can see two approaches:
1. Create a v3.1 subdirectory, remove the reflection workarounds from its
code (the kind sketched after this list), add explicit 3.1-specific
modules, and build and test against 3.1. We would then have three Spark 3
versions. At the next release, deprecate Spark 3.0 support and remove the
v3.0 directory and its modules.
2. Support Spark 3.1 and 3.0 from the common 3.0-based code. At the next
release, deprecate Spark 3.0 support, rename v3.0 to v3.1, and update its
code to remove the reflection workarounds.
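
To be concrete about the reflection workarounds: code that compiles against
Spark 3.0 can only reach 3.1-only APIs through reflection. A minimal sketch
of the pattern in Java (the compat class and the idea of a no-arg 3.1-only
SparkSession method are hypothetical, for illustration, not actual Iceberg
code):

    import java.lang.reflect.Method;

    import org.apache.spark.sql.SparkSession;

    class Spark31Compat {
      // Looks up a hypothetical no-arg SparkSession method that exists
      // only in Spark 3.1, so this file still compiles against Spark 3.0.
      static Object invoke31Only(SparkSession session, String methodName) {
        try {
          Method method = SparkSession.class.getMethod(methodName);
          return method.invoke(session);
        } catch (ReflectiveOperationException e) {
          throw new UnsupportedOperationException(
              "Spark 3.1 or later is required for " + methodName, e);
        }
      }
    }

In a dedicated v3.1 module compiled against Spark 3.1, the same call
becomes a plain method invocation, which is the cleanup both approaches
reach, just at different times.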

As I said, I missed the meeting. Perhaps approach 1 is what was decided?
(If it is, I'm willing to take on the work. I just need to know the plan.)
Thanks,
Wing Yew
