I kind of doubt the Kafka 0.10 integration is going to change much at
all before the upgrade to 0.11

On Wed, Sep 6, 2017 at 8:57 AM, Sean Owen <so...@cloudera.com> wrote:
> Thanks, I can do that. We're then in the funny position of having one
> deprecated Kafka API, and one experimental one.
>
> Is the Kafka 0.10 integration as stable as it is going to be, and worth
> marking as such for 2.3.0?
>
>
> On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger <c...@koeninger.org> wrote:
>>
>> +1 to going ahead and giving a deprecation warning now
>>
>> On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
>> > On the road to Scala 2.12, we'll need to make Kafka 0.8 support
>> > optional in the build, because it is not available for Scala 2.12.
>> >
>> > https://github.com/apache/spark/pull/19134 adds that profile. I mention
>> > it because this means that Kafka 0.8 becomes "opt-in" and has to be
>> > explicitly enabled, and that may have implications for downstream
>> > builds.
>> >
>> > Yes, we can add <activeByDefault>true</activeByDefault>. However, it
>> > only takes effect when no other profiles are activated, which makes it
>> > more deceptive than useful IMHO. (We don't use it otherwise.)
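For context, a minimal sketch of an `activeByDefault` profile in a `pom.xml`, illustrating the pitfall described above: the profile is active only until any other profile in the POM is activated (e.g. via `-P` on the command line), at which point it is silently dropped. The profile id and dependency here are illustrative, not the actual PR contents:

```xml
<!-- Illustrative only: an activeByDefault profile is deactivated as soon
     as any other profile in this POM is activated, e.g. with -Pyarn. -->
<profiles>
  <profile>
    <id>kafka-0-8</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
        <version>${project.version}</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```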
>> >
>> > Reviewers may want to check my work, especially as regards the Python
>> > test support and SBT build.
>> >
>> >
>> > Another related question: when should 0.8 support be deprecated and
>> > removed? It seems sudden to remove it in 2.3.0, so maybe deprecation is
>> > in order first. The driver is that Kafka 0.11 and 1.0 will possibly
>> > require yet another variant of streaming support (not sure yet), and
>> > three versions is too many. Deprecating now opens more options sooner.
