[ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15174856#comment-15174856 ]

Mark Grover commented on SPARK-12177:
-------------------------------------

Hi [~tdas] and [~rxin], could you weigh in on these questions so we can unblock this work?
1. Should we support both Kafka 0.8 and 0.9, or just 0.9? The pros and cons are 
listed [here|https://github.com/apache/spark/pull/11143#issuecomment-182154267], 
along with what other projects are doing.
2. Should we put the implementation based on the new Kafka consumer API in a 
separate subproject, keeping the same class names (e.g. KafkaRDD), or create new 
classes (e.g. NewKafkaRDD) in the same subproject, as Hadoop did with its new 
MapReduce API?

Thanks!

> Update KafkaDStreams to new Kafka 0.9 Consumer API
> --------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API that 
> is not compatible with the old one. So I added support for the new consumer 
> API as separate classes in the package org.apache.spark.streaming.kafka.v09, 
> with the API changed accordingly. I did not remove the old classes, for 
> backward compatibility: users will not need to change their existing Spark 
> applications when they upgrade to a new Spark version.
> Please review my changes.
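For context, the incompatibility goes beyond class names: the old 0.8 high-level consumer coordinates through ZooKeeper, while the new 0.9 consumer connects to the brokers directly, so even the basic configuration keys differ. A minimal illustration (property names are from the Kafka consumer configuration docs; host and group values are placeholders):

```properties
# Old (0.8) high-level consumer: coordinates through ZooKeeper
zookeeper.connect=zk-host:2181
group.id=my-group

# New (0.9) consumer: bootstraps from the brokers, no ZooKeeper dependency
bootstrap.servers=broker-host:9092
group.id=my-group
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```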



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
