Hi Kafka devs,
I come to you with a dilemma and a request.

From what I understand, Kafka users need to upgrade their brokers to 0.9.x
before they upgrade their clients to 0.9.x.
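
For anyone who hasn't hit this yet: if I'm reading the 0.9.0.0 upgrade
notes correctly, the broker-first rolling upgrade looks roughly like this
(in server.properties on each broker; exact version strings may differ):

    # Step 1: before swapping in the 0.9.x code, pin the wire protocol
    inter.broker.protocol.version=0.8.2.X

    # Step 2: upgrade and restart the brokers one at a time

    # Step 3: once every broker runs 0.9.x code, bump the protocol
    # version and do a second rolling restart
    inter.broker.protocol.version=0.9.0.0

Only after all of that can the clients move to 0.9.x.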

However, that presents a problem for other projects that integrate with
Kafka (Spark, Flume, Storm, etc.). From here on, I will speak for Spark +
Kafka, since that's the integration I am most familiar with.

In light of the compatibility (or lack thereof) between 0.8.x and 0.9.x,
Spark faces the question of which version(s) of Kafka to support, and has
two options (discussed in this PR
<https://github.com/apache/spark/pull/11143>):
1. Upgrade to Kafka 0.9, dropping support for 0.8. Storm and Flume are
already on this path.
2. Introduce complexity in our code to support both 0.8 and 0.9 for the
entire duration of our next major release line (Apache Spark 2.x); there's
a rough sketch of what that entails right after this list.
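
To give a flavor of what option 2 means in practice, here is the kind of
version-agnostic facade we'd end up maintaining over both client APIs. All
the names below are hypothetical, not actual Spark code; it's just a
sketch:

    import java.util.{Collections, Properties}
    import scala.collection.JavaConverters._
    import org.apache.kafka.clients.consumer.KafkaConsumer

    // Every call site in Spark would have to go through a facade like
    // this instead of using either client API directly.
    trait KafkaRecordStream {
      def poll(timeoutMs: Long): Iterator[(Array[Byte], Array[Byte])]
      def close(): Unit
    }

    // 0.9.x path, using the new org.apache.kafka.clients.consumer API.
    class Kafka09Stream(bootstrap: String, groupId: String, topic: String)
        extends KafkaRecordStream {
      private val props = new Properties()
      props.put("bootstrap.servers", bootstrap)
      props.put("group.id", groupId)
      props.put("key.deserializer",
        "org.apache.kafka.common.serialization.ByteArrayDeserializer")
      props.put("value.deserializer",
        "org.apache.kafka.common.serialization.ByteArrayDeserializer")
      private val consumer =
        new KafkaConsumer[Array[Byte], Array[Byte]](props)
      consumer.subscribe(Collections.singletonList(topic))

      def poll(timeoutMs: Long): Iterator[(Array[Byte], Array[Byte])] =
        consumer.poll(timeoutMs).iterator.asScala.map(r => (r.key, r.value))

      def close(): Unit = consumer.close()
    }

    // The 0.8.x path would wrap the old kafka.consumer.* / SimpleConsumer
    // API, with its manual leader discovery and offset handling; stubbed
    // out here.
    class Kafka08Stream extends KafkaRecordStream {
      def poll(timeoutMs: Long): Iterator[(Array[Byte], Array[Byte])] = ???
      def close(): Unit = ???
    }

Multiply that by offset management, security features, and the test
matrix, and you can see why we'd rather not carry both.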

I'd love to hear your thoughts on which option you recommend.

Long term, I'd really appreciate it if Kafka could do something that
spares Spark from having to support two, or even more, versions of Kafka.
And if there is anything that I personally, or the Spark project, can do
during your next release candidate phase to make things easier, please do
let us know.

Thanks!
Mark
