Re: Issue: KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey

2019-10-21 Thread Gabor Somogyi
With the mentioned parameter the capacity can be increased, but the main
question is why this is happening in the first place.
Even on a really beefy machine, having more than 64 consumers per executor
is quite extreme. Horizontal scaling (more executors) might be a better
option for reaching maximum performance.
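
As a sketch of both options (please double check against your version: if I
remember correctly, the Structured Streaming source in 2.4 reads its cache
capacity from spark.sql.kafkaConsumerCache.capacity, and the numbers below
are only illustrative, not recommendations):

    // Sketch only: raise the per-executor consumer cache and/or scale out.
    // Capacity 128 and 8 executors are illustrative values, not recommendations.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("kafka-consumer-cache-demo")
      // Structured Streaming consumer cache, default 64 per executor:
      .config("spark.sql.kafkaConsumerCache.capacity", "128")
      // Usually preferable: scale out instead of growing the per-executor cache.
      .config("spark.executor.instances", "8")
      .getOrCreate()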

BR,
G


On Mon, Oct 21, 2019 at 3:20 PM peter wrote:

> You can try increasing the setting
> spark.streaming.kafka.consumer.cache.maxCapacity.


Re: Issue: KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey

2019-10-21 Thread peter
You can try increasing the setting spark.streaming.kafka.consumer.cache.maxCapacity.
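
A minimal sketch of applying it (note: as far as I know this key belongs to
the DStream integration, spark-streaming-kafka-0-10; for the Structured
Streaming source the analogous knob is spark.sql.kafkaConsumerCache.capacity;
the value 128 is illustrative only):

    // Sketch, assuming the DStream (spark-streaming-kafka-0-10) API;
    // the capacity value 128 is an illustrative assumption.
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("kafka-dstream-cache-demo")
      .set("spark.streaming.kafka.consumer.cache.maxCapacity", "128")
    val ssc = new StreamingContext(conf, Seconds(10))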

 



Issue: KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey

2019-10-21 Thread Shyam P
Hi,
I am using spark-sql-2.4.1v with Kafka.

I am facing a slow consumer issue, and I see the warning "KafkaConsumer
cache hitting max capacity of 64, removing consumer for
CacheKey(spark-kafka-source-33321dde-bfad-49f3-bdf7-09f95883b6e9--1249540122-executor)"
in the logs.
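
My setup is roughly the following (the broker address and topic name below
are placeholders, not my real values):

    // Minimal sketch of the Structured Streaming Kafka source in use;
    // "localhost:9092" and "my_topic" are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("slow-consumer-repro")
      .getOrCreate()

    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "my_topic")
      .load()

As far as I understand, each executor caches one KafkaConsumer per
group/topic/partition, and once the cache grows past 64 entries it starts
evicting consumers, which produces this warning.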


More on the same:
https://stackoverflow.com/questions/58456939/how-to-set-spark-consumer-cache-to-fix-kafkaconsumer-cache-hitting-max-capaci


Can anyone please advise how to fix this and improve my consumer
performance?


Thank you.