You can try increasing the setting spark.streaming.kafka.consumer.cache.maxCapacity.
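For example, a minimal sketch of passing that property at submit time (this property belongs to the spark-streaming-kafka-0-10 DStream integration; the value 128 is only illustrative, and the application jar/class are placeholders):

```shell
# Raise the per-executor Kafka consumer cache above its default of 64.
# The chosen capacity (128) is an example; size it to the number of
# topic-partitions each executor is expected to read.
spark-submit \
  --conf spark.streaming.kafka.consumer.cache.maxCapacity=128 \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```

The same property can also be set on the SparkConf in code before the streaming context is created.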
From: Shyam P [mailto:shyamabigd...@gmail.com]
Sent: 21 October 2019 20:43
To: kafka-clie...@googlegroups.com; spark users <user@spark.apache.org>
Subject: Issue : KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey

Hi,

I am using spark-sql-2.4.1v with Kafka and am facing a slow-consumer issue. I see the warning "KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey(spark-kafka-source-33321dde-bfad-49f3-bdf7-09f95883b6e9--1249540122-executor)" in the logs.

More on the same: https://stackoverflow.com/questions/58456939/how-to-set-spark-consumer-cache-to-fix-kafkaconsumer-cache-hitting-max-capaci

Can anyone please advise how to fix this and improve my consumer performance?

Thank you.