Hi,

by following this article I managed to consume messages from Kafka 0.10 in
Spark 2.0:
http://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

However, the Java examples are missing, and I would like to commit the
offsets myself after processing each RDD. Does anybody have a working Java
example for me? "HasOffsetRanges" seems to be a Scala trait, and
"offsetRanges" is not available after casting the JavaRDD to
"HasOffsetRanges".

Thanks a lot!

Scala example:

// Inside stream.foreachRDD { rdd => ... }, after the batch output completes:
val offsets = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
stream.asInstanceOf[CanCommitOffsets].commitAsync(offsets)
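For reference, a Java sketch of the same pattern (following the 0-10
integration guide linked above). The likely cause of the cast problem: in
Java the JavaRDD wrapper does not implement HasOffsetRanges, so you must
cast the underlying rdd.rdd(); likewise, commitAsync lives on the wrapped
input DStream, reached via stream.inputDStream(). The "stream" variable is
assumed to be a JavaInputDStream created via KafkaUtils.createDirectStream:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.streaming.kafka010.CanCommitOffsets;
import org.apache.spark.streaming.kafka010.HasOffsetRanges;
import org.apache.spark.streaming.kafka010.OffsetRange;

// Assumes: JavaInputDStream<ConsumerRecord<String, String>> stream,
// created with KafkaUtils.createDirectStream and enable.auto.commit=false.
stream.foreachRDD(new VoidFunction<JavaRDD<ConsumerRecord<String, String>>>() {
  @Override
  public void call(JavaRDD<ConsumerRecord<String, String>> rdd) {
    // Cast the wrapped RDD, not the JavaRDD: only rdd.rdd()
    // implements the HasOffsetRanges trait.
    OffsetRange[] offsetRanges = ((HasOffsetRanges) rdd.rdd()).offsetRanges();

    // ... process the records in rdd here ...

    // Commit the offsets only after the output has completed,
    // via the wrapped input DStream.
    ((CanCommitOffsets) stream.inputDStream()).commitAsync(offsetRanges);
  }
});
```

Note that commitAsync is asynchronous and only enqueues the commit; it is
executed on the next batch, so it gives at-least-once rather than
exactly-once semantics.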
