[ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15188761#comment-15188761 ]

Praveen Devarao commented on SPARK-12177:
-----------------------------------------

Hi Cody,

The last time, when I had the TopicPartition and OffsetAndMetadata classes made 
serializable, the argument was that these classes are used by end users and are 
metadata classes that would be needed for checkpointing.

As for ConsumerRecord, this class is meant to hold the actual data and would 
usually not be needed for checkpointing: if we need the data, we can always go 
back to the respective offset in the respective partition of the topic. 
Also, the ConsumerRecord class has members of generic type (K and V), so its 
serializability really depends on what type of object the user flows in and 
whether that type is itself serializable.
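A minimal sketch of that last point, using a hypothetical Record class standing in for ConsumerRecord (the class name and fields here are illustrative, not Kafka's actual implementation): even if the container implements Serializable, Java serialization still fails whenever the key or value type does not.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.io.UncheckedIOException;

// Hypothetical analogue of ConsumerRecord<K, V>: a generic holder whose
// serializability depends entirely on the runtime types bound to K and V.
class Record<K, V> implements Serializable {
    final K key;
    final V value;
    Record(K key, V value) { this.key = key; this.value = value; }
}

public class SerializationDemo {
    // Attempt Java serialization; report whether it succeeded.
    static boolean canSerialize(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // String is Serializable, so this record serializes fine.
        System.out.println(canSerialize(new Record<>("k", "v")));
        // A non-serializable value poisons the whole record, even though
        // Record itself declares implements Serializable.
        System.out.println(canSerialize(new Record<>("k", new Object())));
    }
}
```

So marking the container serializable gives no guarantee by itself; the guarantee would have to come from the user's key and value types.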

Given this, from the Kafka perspective I am not sure how we can justify 
marking this class as serializable.

Thanks

Praveen

> Update KafkaDStreams to new Kafka 0.9 Consumer API
> --------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API that 
> is not compatible with the old one. So I added the new consumer API as separate 
> classes in the package org.apache.spark.streaming.kafka.v09 with the changed 
> API. I did not remove the old classes, for backward compatibility: users will 
> not need to change their old Spark applications when they upgrade to a new 
> Spark version. Please review my changes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
