Pull request is ready to go: https://github.com/apache/spark/pull/19134
I flag it one more time because it means Kafka 0.8 support is deprecated in 2.3.0,
and because building that support will now require the -Pkafka-0-8 profile.
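For anyone building from source who still needs the old integration, opting
back in would look roughly like this (the invocation below is illustrative,
not final build docs):

    ./build/mvn -Pkafka-0-8 -DskipTests clean package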
Pardon, I want to be sure: does this mean PySpark Kafka support effectively has
been deprecated as well?
For those following along, see discussions at
https://github.com/apache/spark/pull/19134
It's now also clear that we'd need to remove Kafka 0.8 examples if Kafka
0.8 becomes optional. I think that's all reasonable but the change is
growing beyond just putting it behind a profile.
On Wed, Sep 6, 2017, Cody Koeninger wrote:
I kind of doubt the Kafka 0.10 integration is going to change much at all
before the upgrade to 0.11.
On Wed, Sep 6, 2017 at 8:57 AM, Sean Owen wrote:
Thanks, I can do that. We're then in the funny position of having one
deprecated Kafka API, and one experimental one.
Is the Kafka 0.10 integration as stable as it is going to be, and worth
marking as such for 2.3.0?
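For what it's worth, the deprecation half of that is mechanically small. A
minimal sketch of the idea (the object, signature, and message below are
illustrative, not the actual Spark code):

    // Sketch only: attach a deprecation warning to a 0.8-style entry point
    // without changing its behavior.
    object KafkaUtils08Sketch {
      @deprecated("Kafka 0.8 support is deprecated; use the kafka-0-10 integration", "2.3.0")
      def createStream(zkQuorum: String, groupId: String, topics: Map[String, Int]): Unit = {
        // The real method returns a ReceiverInputDStream; elided here to stay self-contained.
        println(s"connecting to $zkQuorum as $groupId for topics ${topics.keys.mkString(", ")}")
      }

      def main(args: Array[String]): Unit = {
        // Existing call sites keep compiling and running; they just pick up a
        // compile-time deprecation warning.
        createStream("localhost:2181", "demo-group", Map("events" -> 1))
      }
    }

Marking the 0.10 side stable would mostly be the reverse: dropping the
experimental annotation/wording once we're comfortable the API won't change.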
On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger wrote:
+1
From: Cody Koeninger <c...@koeninger.org>
Sent: Tuesday, September 5, 2017 8:12:07 AM
To: Sean Owen
Cc: dev
Subject: Re: Putting Kafka 0.8 behind an (opt-in) profile
+1 to going ahead and giving a deprecation warning now
On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen wrote:
On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional in
the build, because it is not available for Scala 2.12.
https://github.com/apache/spark/pull/19134 adds that profile. I mention it
because this means that Kafka 0.8 becomes "opt-in" and has to be explicitly
enabled, and because it is probably also time to deprecate the 0.8 support.