dongjoon-hyun commented on a change in pull request #23412: [SPARK-26477][CORE] Use ConfigEntry for hardcoded configs for unsafe category URL: https://github.com/apache/spark/pull/23412#discussion_r247336496
########## File path: core/src/main/scala/org/apache/spark/internal/config/package.scala ##########

```
@@ -732,6 +732,21 @@ package object config {
       .checkValue(v => v > 0, "The max failures should be a positive value.")
       .createWithDefault(40)

+  private[spark] val UNSAFE_EXCEPTION_ON_MEMORY_LEAK =
+    ConfigBuilder("spark.unsafe.exceptionOnMemoryLeak")
+      .booleanConf
+      .createWithDefault(false)
+
+  private[spark] val UNSAFE_SORTER_SPILL_READ_AHEAD_ENABLED =
+    ConfigBuilder("spark.unsafe.sorter.spill.read.ahead.enabled")
+      .booleanConf
+      .createWithDefault(true)
+
+  private[spark] val UNSAFE_SORTER_SPILL_READER_BUFFER_SIZE =
+    ConfigBuilder("spark.unsafe.sorter.spill.reader.buffer.size")
+      .bytesConf(ByteUnit.BYTE)
```

Review comment: Shall we add `checkValue`? Previously, values outside the bounds were replaced with the default value, with a warning logged, in `UnsafeSorterSpillReader`:

```
if (bufferSizeBytes > MAX_BUFFER_SIZE_BYTES || bufferSizeBytes < DEFAULT_BUFFER_SIZE_BYTES) {
```

It would be great to validate here, since we have a configuration entry now.

Hi, @gatorsmile. If we use `checkValue` here, do we need to add a behavior change doc? IMO, these three configurations can be `.internal()` configurations, and then we can skip the behavior change doc for them.
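The suggestion above could be sketched roughly as follows. This is an illustrative sketch, not the final patch: it assumes the 1 MB default and 16 MB maximum that `UnsafeSorterSpillReader` enforces in its hard-coded check, and the error message wording is made up for the example.

```scala
// Sketch only: bounds (1 MB default, 16 MB max) are assumed from the
// existing hard-coded check in UnsafeSorterSpillReader; message text is
// illustrative.
private[spark] val UNSAFE_SORTER_SPILL_READER_BUFFER_SIZE =
  ConfigBuilder("spark.unsafe.sorter.spill.reader.buffer.size")
    .internal()
    .bytesConf(ByteUnit.BYTE)
    // Reject out-of-range values at set time instead of silently falling
    // back to the default at read time.
    .checkValue(v => 1024 * 1024 <= v && v <= 16 * 1024 * 1024,
      "The spill reader buffer size must be between 1 MB and 16 MB.")
    .createWithDefault(1024 * 1024)
```

With `.internal()`, the entry is excluded from the user-facing configuration docs, which is the argument for skipping a behavior change note: the old replace-with-default-and-warn behavior becomes a hard validation error, but only for an internal setting.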