+1 for a 0.8.0 release with Spark 2.4, and then moving on to Spark 3.0 when
it's ready.
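
For what it's worth, the sub-module layout Saisai proposes below could be
sketched roughly like this in Gradle. This is only an illustrative sketch:
the module names, artifact coordinates, and version numbers here are
hypothetical, not the project's actual build configuration.

```groovy
// settings.gradle -- include both Spark modules in one build
// (module names 'spark2' / 'spark3' are illustrative)
include 'spark2'
include 'spark3'

// spark2/build.gradle -- pins the Spark 2.4 line (Scala 2.11)
dependencies {
    compileOnly 'org.apache.spark:spark-sql_2.11:2.4.5'
}

// spark3/build.gradle -- pins the Spark 3.0 line (Scala 2.12)
dependencies {
    compileOnly 'org.apache.spark:spark-sql_2.12:3.0.0'
}
```

With something like this, a single `./gradlew build` would produce both
runtime jars, and the shared read/write code could live in a common module
that both depend on. Whether the baseline/dependency-lock tooling tolerates
two Spark versions in one build is the open question Ryan raises below.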

On Tue, 3 Mar 2020 at 16:32, Ryan Blue <rb...@netflix.com.invalid> wrote:

> Thanks for bringing this up, Saisai. I tried to do this a couple of months
> ago, but ran into a problem with dependency locks. I couldn't get two
> different versions of Spark packages in the build with baseline, but maybe
> I was missing something. If you can get it working, I think it's a great
> idea to get this into master.
>
> Otherwise, I was thinking about proposing a 0.8.0 release in the next
> month or so based on Spark 2.4. Then we could merge the branch into master
> and do another release for Spark 3.0 when it's ready.
>
> rb
>
> On Tue, Mar 3, 2020 at 6:07 AM Saisai Shao <sai.sai.s...@gmail.com> wrote:
>
>> Hi team,
>>
>> I was thinking of merging the spark-3 branch into master. Also, per the
>> earlier discussion, we could make spark-2 and spark-3 coexist as two
>> different sub-modules. With this, one build could generate both spark-2 and
>> spark-3 runtime jars, and users could pick whichever they prefer.
>>
>> One concern is that the two modules would share a lot of common code in
>> the read/write path, which will increase the maintenance overhead of
>> keeping the two copies consistent.
>>
>> So I'd like to hear your thoughts. Any suggestions?
>>
>> Thanks
>> Saisai
>>
>
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>