[ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15328699#comment-15328699 ]

Mark Grover commented on SPARK-12177:
-------------------------------------

Hi Ismael and Cody,
My personal opinion was to hold off because a) the new consumer API was still 
marked as beta, so I wasn't sure of the compatibility guarantees, which Kafka 
did seem to break a little (as discussed 
[here|http://mail-archives.apache.org/mod_mbox/kafka-dev/201605.mbox/%3CCAKm=r7v5jgg9qxgjioczdph9vej57m46ngy_626kiq-ovdx...@mail.gmail.com%3E]),
and b) the real benefit is security - I am personally a little more biased towards 
authentication (Kerberos) than encryption, so I was just waiting for delegation 
tokens to land. 
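(To make the security point concrete, here is a rough sketch of the extra entries one would add to the consumer parameter map - settings that only the new consumer understands; the values are placeholders and assume a JAAS/Kerberos login is already configured on the client:)

{code:scala}
// Hypothetical additions to the Kafka parameter map passed to the new consumer API.
// The old simple/high-level consumer has no equivalent of these settings, which is
// the security argument for moving to the new API.
val secureKafkaParams = Map[String, Object](
  "security.protocol" -> "SASL_SSL",                        // Kerberos auth over TLS
  "sasl.kerberos.service.name" -> "kafka",                   // service principal of the brokers
  "ssl.truststore.location" -> "/etc/kafka/truststore.jks",  // placeholder path
  "ssl.truststore.password" -> "changeit"                    // placeholder password
)
{code}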

Now that 0.10.0 is released, there's a good chance delegation tokens will land 
in Kafka 0.11.0, and the new consumer API is marked stable, so I am more open 
to this PR being merged; it's been around for too long anyway. Cody, what do 
you say? Any reason you'd want to wait? If not, we can make a case for this 
going in now.

As far as the logistics of whether this belongs in Apache Bahir or not - today, I 
don't have a strong opinion on where the Kafka integration should reside. What I do 
feel strongly about, like Cody said, is that the old consumer API integration 
and the new consumer API integration should reside in the same place. Since the old 
integration is in Spark, that's where the new one makes sense. If a vote on Apache 
Spark results in the Kafka integration being taken out, then moving both the new and 
the old to Apache Bahir would make sense.

> Update KafkaDStreams to new Kafka 0.10 Consumer API
> ---------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API that is 
> not compatible with the old one. So, I added the new consumer API, as separate 
> classes in the package org.apache.spark.streaming.kafka.v09 with the changed API. 
> I did not remove the old classes, for backward compatibility: users will not need 
> to change their old Spark applications when they upgrade to a new Spark version.
> Please review my changes.
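For reference, a minimal sketch of what a direct stream built on the new consumer API looks like. The package, class, and strategy names below follow the Kafka 0.10 integration module (spark-streaming-kafka-0-10) rather than the org.apache.spark.streaming.kafka.v09 layout proposed in this patch, so treat the exact names as assumptions; the broker address, topic, and group id are placeholders.

{code:scala}
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object NewConsumerApiSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("kafka-0-10-sketch").setMaster("local[2]"),
      Seconds(5))

    // These keys are the new KafkaConsumer's own config names; they are passed
    // straight through to the underlying consumer.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",            // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "spark-12177-sketch",                 // placeholder group id
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    // Direct stream backed by the new consumer: one RDD partition per Kafka partition.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("test-topic"), kafkaParams))

    stream.map(record => (record.key, record.value)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}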


