Hi, Michael.

I'm not sure Apache Spark is in a state close to what you want.

First, both Apache Spark 3.0.0-preview and Apache Spark 2.4 use Avro
1.8.2, and so do the `master` and `branch-2.4` branches. Cutting new
releases would not give you what you want.

Do we have a PR on the master branch? If not, before we start discussing
releases, could you make a PR on the master branch first? The same goes
for Parquet.

Second, we want to make Apache Spark 3.0.0 as compatible as possible.
An incompatible change could be a reason for rejection even on the
`master` branch for Apache Spark 3.0.0.

Lastly, we may consider backporting once it lands on the `master` branch
for 3.0. However, as Nan Zhu said, dependency-upgrade backport PRs are -1
by default. Usually, they are allowed only in serious cases such as
security issues or production outages.

Bests,
Dongjoon.


On Fri, Nov 22, 2019 at 9:00 AM Ryan Blue <rb...@netflix.com.invalid> wrote:

> Just to clarify, I don't think that Parquet 1.10.1 to 1.11.0 is a
> runtime-incompatible change. The example mixed 1.11.0 and 1.10.1 in the
> same execution.
>
> Michael, please be more careful about announcing compatibility problems in
> other communities. If you've observed problems, let's find out the root
> cause first.
>
> rb
>
> On Fri, Nov 22, 2019 at 8:56 AM Michael Heuer <heue...@gmail.com> wrote:
>
>> Hello,
>>
>> Avro 1.8.2 to 1.9.1 is a binary incompatible update, and it appears that
>> Parquet 1.10.1 to 1.11 will be a runtime-incompatible update (see thread on
>> dev@parquet
>> <https://mail-archives.apache.org/mod_mbox/parquet-dev/201911.mbox/%3c8357699c-9295-4eb0-a39e-b3538d717...@gmail.com%3E>
>> ).
>>
>> Might there be any desire to cut a Spark 2.4.5 release so that users can
>> pick up these changes independently of all the other changes in Spark 3.0?
>>
>> Thank you in advance,
>>
>>    michael
>>
>
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>