[ 
https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15175999#comment-15175999
 ] 

Mark Grover commented on SPARK-12177:
-------------------------------------

I think the core of the question is a much broader Spark question - how many 
past versions to support?

To add some more color to the question at hand, Kafka has already decided that 
the [next version of Kafka will be 
0.10.0|https://github.com/apache/kafka/commit/b084c485e25bfe77154e805219b24714d59c396c]
 (instead of 0.9.1), and this next version will have yet another protocol 
change. So, where do we go from there? Supporting Kafka 0.8, 0.9, and 0.10.0 
in 2.x?

I still think Spark 2.0 is a good time to drop support for Kafka 0.8.x. Other 
projects are doing the same, and in minor releases at that (links to the Flume 
and Storm JIRAs are on the PR). Kafka is moving fast, with protocol changes in 
every new non-maintenance release, so keeping up with all the past releases 
will become a huge hassle.

> Update KafkaDStreams to new Kafka 0.9 Consumer API
> --------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API 
> that is not compatible with the old one. So I added support for the new 
> consumer API, in separate classes under the package 
> org.apache.spark.streaming.kafka.v09 with the changed API. I did not remove 
> the old classes, for backward compatibility: users will not need to change 
> their existing Spark applications when they upgrade to a new Spark version.
> Please review my changes.
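For context on why the two consumer APIs need separate classes: the 0.8 high-level consumer and the 0.9 new consumer are configured differently and speak different protocols. A minimal sketch of the configuration difference (property names are from the Kafka consumer config documentation; the broker and ZooKeeper addresses and the group id are placeholders):

```properties
# Kafka 0.8 high-level consumer: coordinates offsets and group
# membership through ZooKeeper.
zookeeper.connect=zk-host:2181
group.id=spark-streaming-group

# Kafka 0.9 new consumer: talks to the brokers directly, with no
# ZooKeeper dependency, and requires explicit deserializers.
bootstrap.servers=broker-host:9092
group.id=spark-streaming-group
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```

Because neither configuration is a superset of the other, code written against one consumer cannot simply be pointed at the other, which is what motivates keeping the old classes alongside the new v09 package.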



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
