Github user HeartSaVioR commented on the issue: https://github.com/apache/spark/pull/21469

My series of patches is based on two metrics: `size for memory usage of latest version` and `size for total memory usage of loaded versions`. SPARK-24717 (#21700) made it possible to tune overall state memory usage, but end users cannot do that tuning with only one of the two metrics available.

IMHO, I'm not 100% sure how much confusion this patch would cause for end users, but if the intention of `memoryUsedBytes` is to measure the overall state of a partition, what about redefining `memoryUsedBytes` as `size for total memory usage of loaded versions`, and exposing `size for memory usage of latest version` as a custom metric?
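To illustrate the distinction between the two metrics, here is a minimal sketch (not Spark's actual `StateStore` API; the object and method names are hypothetical) of a version cache that tracks both the size of the latest loaded version and the total size across all versions held in memory:

```scala
// Hypothetical sketch, NOT Spark's StateStore implementation: models a
// state provider that keeps several versions of state in memory and
// exposes the two metrics discussed above.
object StateMemoryMetrics {
  // Map of state version -> estimated in-memory size in bytes,
  // in insertion (load) order.
  private val loadedVersions =
    scala.collection.mutable.LinkedHashMap.empty[Long, Long]

  def load(version: Long, sizeBytes: Long): Unit =
    loadedVersions(version) = sizeBytes

  // "size for memory usage of latest version": only the most recently
  // loaded version counts.
  def latestVersionSizeBytes: Long =
    loadedVersions.lastOption.map(_._2).getOrElse(0L)

  // "size for total memory usage of loaded versions": the sum over all
  // versions currently cached; under the proposal this is what
  // `memoryUsedBytes` would report.
  def totalLoadedSizeBytes: Long =
    loadedVersions.values.sum
}
```

With only one of the two numbers, a user cannot tell whether high memory usage comes from a single large version or from many retained versions, which is exactly the tuning decision SPARK-24717 enables.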