Github user HeartSaVioR commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22598#discussion_r224322849

    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -647,4 +647,42 @@ package object config {
           .stringConf
           .toSequence
           .createWithDefault(Nil)
    +
    +  private[spark] val KAFKA_DELEGATION_TOKEN_ENABLED =
    +    ConfigBuilder("spark.kafka.delegation.token.enabled")
    +      .doc("Set to 'true' for obtaining delegation token from kafka.")
    +      .booleanConf
    +      .createWithDefault(false)
    +
    +  private[spark] val KAFKA_BOOTSTRAP_SERVERS =
    +    ConfigBuilder("spark.kafka.bootstrap.servers")
    --- End diff --

    While it is not possible to pass the relevant configuration to the source/sink directly, pre-defining Kafka-related configurations one by one here feels too tightly coupled to Kafka to me. It may also cause confusion about where to put Kafka source/sink configuration: these entries must be used only for delegation tokens, but neither the configuration names nor their docs indicate that.

    My 2 cents: just reserve a prefix such as `spark.kafka.token` (or similar), leave a comment, and don't define anything here. I would like to hear what committers think about how to add external configurations to Spark conf.
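    As a rough illustration of the prefix idea (the object name, prefix, and helper below are hypothetical, not part of this PR or of Spark itself), the delegation token provider could collect matching entries at runtime instead of relying on pre-defined keys:

    ```scala
    // Hypothetical sketch: reserve a namespace instead of pre-defining each key.
    // Entries such as spark.kafka.token.bootstrap.servers would be picked up by
    // the delegation token provider at runtime, not declared in package.scala.
    object KafkaTokenConfSketch {
      // Reserved prefix; the exact name is an assumption, per the comment above.
      val KafkaTokenPrefix = "spark.kafka.token."

      // Collect all reserved-prefix entries from a flat conf map, stripping the
      // prefix so the values can be handed to the Kafka client as-is.
      def kafkaTokenParams(conf: Map[String, String]): Map[String, String] =
        conf.collect {
          case (key, value) if key.startsWith(KafkaTokenPrefix) =>
            key.stripPrefix(KafkaTokenPrefix) -> value
        }
    }
    ```

    With this shape, setting `spark.kafka.token.bootstrap.servers=broker:9092` would yield `bootstrap.servers -> broker:9092` without any per-key ConfigBuilder entry, and the reserved namespace alone documents that the settings are for delegation tokens only.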