Github user zsxwing commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20703#discussion_r171690388
  
    --- Diff: external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchReader.scala ---
    @@ -76,6 +76,10 @@ private[kafka010] class KafkaMicroBatchReader(
       private val maxOffsetsPerTrigger =
         Option(options.get("maxOffsetsPerTrigger").orElse(null)).map(_.toLong)
     
    +  private val useConsumerCache = options.getBoolean(
    +    "kafkaConsumer.useConsumerCache",
    +    SparkEnv.get.conf.getBoolean("spark.streaming.kafka.consumer.cache.enabled", true))
    --- End diff ---
    
    We usually don't reuse configurations from DStream. You can add a new one starting with `spark.sql.streaming` in `SQLConf`.
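    
    For illustration, Structured Streaming configurations are typically declared in `org.apache.spark.sql.internal.SQLConf` via its `buildConf` helper. A minimal sketch of what such an entry could look like (the key name and `val` name here are hypothetical examples, not what this PR should necessarily use):
    
    ```scala
    // In org.apache.spark.sql.internal.SQLConf (sketch; names are illustrative)
    val KAFKA_CONSUMER_CACHE_ENABLED =
      buildConf("spark.sql.streaming.kafka.consumer.cache.enabled")
        .doc("Whether to cache Kafka consumers across micro-batches.")
        .booleanConf
        .createWithDefault(true)
    ```
    
    The reader could then look the value up through the session's `SQLConf` instead of reading the DStream key from `SparkEnv.get.conf`.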


---
