Hello,

I'm getting the exception below when testing Spark 2.0 with Kafka 0.10.

16/08/23 16:31:01 INFO AppInfoParser: Kafka version : 0.10.0.0
16/08/23 16:31:01 INFO AppInfoParser: Kafka commitId : b8642491e78c5a13
16/08/23 16:31:01 INFO CachedKafkaConsumer: Initial fetch for spark-executor-example mt_event 0 15782114
16/08/23 16:31:01 INFO AbstractCoordinator: Discovered coordinator 10.150.254.161:9233 (id: 2147483646 rack: null) for group spark-executor-example.
16/08/23 16:31:02 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 6)
java.lang.AssertionError: assertion failed: Failed to get records for spark-executor-example mt_event 0 15782114 after polling for 512
        at scala.Predef$.assert(Predef.scala:170)
        at org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
        at org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:227)
        at org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:193)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)

I get this error intermittently: sometimes a few batches are scheduled and run fine, and then the assertion fires. Meanwhile, kafkacat is able to fetch from the same topic continuously, so the broker itself seems reachable.
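For context, the stream is created with the standard kafka-0-10 direct API (the one the stack trace points at); a minimal sketch of that setup is below, with the app name, batch interval, group id, and offset settings as placeholders rather than my exact values. The 512 in the assertion looks like a poll timeout in milliseconds, which I understand can be raised via spark.streaming.kafka.consumer.poll.ms -- I have not yet verified whether raising it avoids the error.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("example")
      // The 512 in the assertion appears to be this timeout (ms);
      // raising it is one thing to try -- not verified to help.
      .set("spark.streaming.kafka.consumer.poll.ms", "10000")

    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "10.150.254.161:9233",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      // Executors prefix this with "spark-executor-", matching the
      // "spark-executor-example" group seen in the log above.
      "group.id" -> "example",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("mt_event"), kafkaParams)
    )

    stream.foreachRDD(rdd => println(s"batch size: ${rdd.count()}"))

    ssc.start()
    ssc.awaitTermination()
  }
}
```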

The full exception is here:
https://gist.github.com/SrikanthTati/c2e95c4ac689cd49aab817e24ec42767

Srikanth
