[ 
https://issues.apache.org/jira/browse/SPARK-30222?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16994085#comment-16994085
 ] 

Jungtaek Lim edited comment on SPARK-30222 at 12/12/19 2:28 AM:
----------------------------------------------------------------

-That config was renamed from "spark.sql.kafkaConsumerCache.capacity" to 
"spark.kafka.consumer.cache.capacity" in Spark 3.0. Please use 
"spark.kafka.consumer.cache.capacity" instead.-

I'm closing this. Please reopen this if the problem persists after changing 
your configuration key.

EDIT: "spark.sql.kafkaConsumerCache.capacity" is for Spark less than 3.0 so the 
configuration name is correct.
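
For reference, in 2.4.x the cache capacity appears to be read from the 
executor's SparkConf (with a default of 64), not from the options passed to 
DataStreamReader, so setting it via .option() as in the report is unlikely 
to take effect. Below is a minimal sketch of setting it on the session 
instead, assuming that 2.4.x behavior; the app name and the placeholder 
broker value are mine, and the topic name is taken from the logs:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class KafkaCacheCapacityDemo {
      public static void main(String[] args) {
        // Set the cache capacity on the SparkConf so it reaches executors.
        // This is the pre-3.0 name; it was renamed to
        // spark.kafka.consumer.cache.capacity in Spark 3.0.
        SparkSession sparkSession = SparkSession.builder()
            .appName("kafka-cache-capacity-demo")
            .config("spark.sql.kafkaConsumerCache.capacity", 128)
            .getOrCreate();

        Dataset<Row> df = sparkSession
            .readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker1:9092") // placeholder
            .option("subscribe", "COMPANY_TRANSACTIONS_INBOUND")
            .load();
      }
    }

The same setting can be passed at submit time with 
--conf spark.sql.kafkaConsumerCache.capacity=128. Note that the WARN itself 
only means a cached consumer was evicted and will be recreated on the next 
use; it hurts performance but does not by itself indicate data loss.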


was (Author: kabhwan):
That config was renamed from "spark.sql.kafkaConsumerCache.capacity" to 
"spark.kafka.consumer.cache.capacity" in Spark 3.0. Please use 
"spark.kafka.consumer.cache.capacity" instead.

I'm closing this. Please reopen this if the problem persists after changing 
your configuration key.

> Still getting KafkaConsumer cache hitting max capacity of 64, removing 
> consumer for CacheKey
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30222
>                 URL: https://issues.apache.org/jira/browse/SPARK-30222
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.4.1
>         Environment: {{Below are the logs.}}
> 2019-12-11 08:33:31,504 [Executor task launch worker for task 1050] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-21)
> 2019-12-11 08:33:32,493 [Executor task launch worker for task 1051] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-5)
> 2019-12-11 08:33:32,570 [Executor task launch worker for task 1052] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-9)
> 2019-12-11 08:33:33,441 [Executor task launch worker for task 1053] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-17)
> 2019-12-11 08:33:33,619 [Executor task launch worker for task 1054] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-2)
> 2019-12-11 08:33:34,474 [Executor task launch worker for task 1055] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-10)
> 2019-12-11 08:33:35,006 [Executor task launch worker for task 1056] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-6)
> 2019-12-11 08:33:36,326 [Executor task launch worker for task 1057] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-14)
> 2019-12-11 08:33:36,634 [Executor task launch worker for task 1058] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-0)
> 2019-12-11 08:33:37,496 [Executor task launch worker for task 1059] WARN 
> org.apache.spark.sql.kafka010.KafkaDataConsumer - KafkaConsumer cache hitting 
> max capacity of 64, removing consumer for 
> CacheKey(spark-kafka-source-93ee3689-79f9-42e8-b1ee-e856570205ae-1923743483-executor,COMPANY_TRANSACTIONS_INBOUND-19)
> 2019-12-11 08:33:39,183 [stream execution thread for [id = 
> b3aec196-e4f2-4ef9-973b-d5685eba917e, runId = 
> 5c35a63a-16ad-4899-b732-1019397770bd]] WARN 
> org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor - Current 
> batch is falling behind. The trigger interval is 15000 milliseconds, but 
> spent 63438 milliseconds
>            Reporter: Shyam
>            Priority: Major
>
> I am using spark-sql 2.4.1 with the Kafka 0.10 integration.
> When I try to consume data, I get the warning below even after setting 
> .option("spark.sql.kafkaConsumerCache.capacity", 128)
>  
> {{Dataset<Row> df = sparkSession}}
> {{       .readStream()}}
> {{       .format("kafka")}}
> {{       .option("kafka.bootstrap.servers", SERVERS)}}
> {{       .option("subscribe", TOPIC)}}
> {{       .option("spark.sql.kafkaConsumerCache.capacity", 128)}}
> {{       .load();}}


