[ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15232828#comment-15232828 ]

Cody Koeninger commented on SPARK-12177:
----------------------------------------

Ok, since SPARK-13877 has been rejected and we're keeping the Kafka DStreams in 
Spark, I'd like to get this moving again.

I've done some basic throughput testing on my PR using 
kafka-producer-perf-test.sh to generate load, and after some tweaks the 
performance is comparable to the existing direct stream.  I've made sure my 
existing transactional / idempotent examples work with the new consumer.

I don't yet have a compelling need to move any of my production jobs to the new 
consumer, but it's at the point that I'd feel comfortable with other people 
testing it out.
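
For anyone who wants to kick the tires, here's a rough sketch of what a job 
against the new consumer looks like. To be clear, the package, class, and 
strategy names below are illustrative and may change before anything merges 
(especially given the subproject question):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.{SparkConf, TaskContext}
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.{HasOffsetRanges, KafkaUtils}

    object NewConsumerSketch {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("new-consumer-sketch"), Seconds(5))

        // New-consumer config: bootstrap.servers and Deserializer classes
        // replace metadata.broker.list and the old Decoder type parameters.
        // Auto-commit is off so offsets can be stored with the results.
        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092",
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "spark-12177-test",
          "auto.offset.reset" -> "earliest",
          "enable.auto.commit" -> (false: java.lang.Boolean)
        )

        val stream = KafkaUtils.createDirectStream[String, String](
          ssc, PreferConsistent,
          Subscribe[String, String](Seq("test"), kafkaParams))

        stream.foreachRDD { rdd =>
          // Offset ranges are still exposed per partition, same as the 0.8
          // direct stream, so the transactional / idempotent output patterns
          // carry over: write results and untilOffset in one transaction.
          val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
          rdd.foreachPartition { iter =>
            val osr = offsetRanges(TaskContext.get.partitionId)
            iter.foreach { record =>
              // store record.value() together with osr.untilOffset here
            }
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }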

Given the issues I've seen with 0.9 / 0.10 (e.g. KAFKA-3135), I'm 100% sure 
that we want this in a totally separate subproject from the existing DStream, 
which should be left at 0.8.

> Update KafkaDStreams to new Kafka 0.9 Consumer API
> --------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API 
> that is not compatible with the old one, so I added support for the new 
> consumer API. I made separate classes in the package 
> org.apache.spark.streaming.kafka.v09 with the changed API. I did not remove 
> the old classes, for backward compatibility: users will not need to change 
> their existing Spark applications when they upgrade to the new Spark 
> version. Please review my changes.


