Hello fellow sparkers,

I'm using Spark to consume messages from Kafka in a non-streaming (batch)
fashion. I'm using spark-streaming-kafka-0-8_2.10 with Spark v2.0 to do
this.
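
For reference, this is roughly how I'm building the RDD today (the broker
list, topic name and offsets below are placeholders for my real config):

import kafka.serializer.StringDecoder
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.kafka.{KafkaUtils, OffsetRange}

val sc = new SparkContext(new SparkConf().setAppName("kafka-batch-read"))

val kafkaParams = Map("metadata.broker.list" -> "broker1:9092,broker2:9092")

// topic, partition, fromOffset, untilOffset
val offsetRanges = Array(
  OffsetRange("my-topic", 0, 0L, 1000L),
  OffsetRange("my-topic", 1, 0L, 1000L))

// Batch (non-streaming) read of the given offset ranges from Kafka.
val rdd = KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder](
  sc, kafkaParams, offsetRanges)

rdd.take(5).foreach { case (k, v) => println(s"$k -> $v") }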

I have a few questions about this setup. Please get back to me if you have
any pointers.

1) Is there any way to get the topic, partition & offset information for each
item from the KafkaRDD? I'm using *KafkaUtils.createRDD[String, String,
StringDecoder, StringDecoder]* to create my Kafka RDD. (My best attempt so far
is pasted after this list.)
2) How can I pass my custom Decoder instead of using the String or Byte
decoder? Are there any examples of this? (A rough attempt is also pasted
below.)
3) Is there a newer version of the connector for consuming from Kafka 0.10 &
Kafka 0.9 clusters?
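
For (1), the closest thing I've found is the messageHandler overload of
createRDD, sketched below. It reuses sc, kafkaParams & offsetRanges from the
snippet above, and KafkaRecord is just my own wrapper class. Is this the
intended way to carry the topic, partition & offset along with each message?

import kafka.common.TopicAndPartition
import kafka.message.MessageAndMetadata
import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.{Broker, KafkaUtils}

// My own wrapper to keep the Kafka metadata next to each key/value pair.
case class KafkaRecord(topic: String, partition: Int, offset: Long, key: String, value: String)

val messageHandler = (mmd: MessageAndMetadata[String, String]) =>
  KafkaRecord(mmd.topic, mmd.partition, mmd.offset, mmd.key, mmd.message)

val rddWithMeta = KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder, KafkaRecord](
  sc, kafkaParams, offsetRanges,
  Map.empty[TopicAndPartition, Broker],  // empty map => leaders are looked up on the driver
  messageHandler)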
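
For (2), this is the kind of decoder I'd like to plug in (MyEvent and the
parsing are just placeholders). My understanding is that the decoder needs a
constructor taking VerifiableProperties because it gets instantiated
reflectively, but I'd appreciate confirmation or a pointer to a proper example:

import kafka.serializer.{Decoder, StringDecoder}
import kafka.utils.VerifiableProperties
import org.apache.spark.streaming.kafka.KafkaUtils

// Placeholder payload type standing in for my real message format.
case class MyEvent(body: String)

// Custom decoder; the VerifiableProperties constructor appears to be required
// because the decoder class is instantiated via reflection.
class MyEventDecoder(props: VerifiableProperties = null) extends Decoder[MyEvent] {
  override def fromBytes(bytes: Array[Byte]): MyEvent =
    MyEvent(new String(bytes, "UTF-8"))  // placeholder parsing logic
}

// Then I would hope to use it like this (again reusing sc, kafkaParams & offsetRanges):
val eventsRdd = KafkaUtils.createRDD[String, MyEvent, StringDecoder, MyEventDecoder](
  sc, kafkaParams, offsetRanges)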

-- 
Thanks & Regards,

*Mukesh Jha <me.mukesh....@gmail.com>*
