LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1119559814
> I've opened a new [draft](https://github.com/apache/spark/pull/36467/files) PR
to replace `finalize()` with a `ReferenceQueue` + daemon thread for
`RocksDBIterator` cleanup and try to avoid
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1119462201
I've opened a new [draft](https://github.com/apache/spark/pull/36467/files) PR to
replace `finalize()` with a `ReferenceQueue` + daemon thread for
`RocksDBIterator` cleanup and try to avoid
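
The `ReferenceQueue` + daemon thread approach described above can be sketched roughly as follows. This is a minimal illustration, not the actual code from the draft PR: the class and field names (`IteratorCleanerSketch`, `NativeHandle`, `pending`) are invented for the example. A `PhantomReference` to each iterator is registered with a shared `ReferenceQueue`; once the iterator becomes unreachable, a daemon thread dequeues the reference and frees the underlying native handle, removing the need for `finalize()`:

```java
import java.lang.ref.PhantomReference;
import java.lang.ref.ReferenceQueue;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class IteratorCleanerSketch {
    // Stand-in for the native RocksDB iterator handle.
    static class NativeHandle {
        final AtomicInteger closed;
        NativeHandle(AtomicInteger closed) { this.closed = closed; }
        void close() { closed.incrementAndGet(); }
    }

    // Stand-in for the Java-side iterator wrapper being tracked.
    static class TrackedIterator { }

    // Phantom reference carrying the native handle to free on reclamation.
    static class CleanerRef extends PhantomReference<TrackedIterator> {
        final NativeHandle handle;
        CleanerRef(TrackedIterator it, ReferenceQueue<TrackedIterator> q, NativeHandle h) {
            super(it, q);
            this.handle = h;
        }
    }

    private final ReferenceQueue<TrackedIterator> queue = new ReferenceQueue<>();
    // Keep the CleanerRefs strongly reachable until they are dequeued.
    private final Set<CleanerRef> pending = ConcurrentHashMap.newKeySet();

    public IteratorCleanerSketch() {
        Thread t = new Thread(() -> {
            while (true) {
                try {
                    // Blocks until the GC enqueues a reference.
                    CleanerRef ref = (CleanerRef) queue.remove();
                    pending.remove(ref);
                    ref.handle.close();  // free the native resource
                } catch (InterruptedException e) {
                    return;
                }
            }
        }, "rocksdb-iterator-cleaner");
        t.setDaemon(true);
        t.start();
    }

    public void register(TrackedIterator it, NativeHandle handle) {
        pending.add(new CleanerRef(it, queue, handle));
    }

    public static void main(String[] args) throws Exception {
        AtomicInteger closed = new AtomicInteger();
        IteratorCleanerSketch cleaner = new IteratorCleanerSketch();
        // No strong reference to the iterator is retained, so it is
        // collectible immediately.
        cleaner.register(new TrackedIterator(), new NativeHandle(closed));
        // Hint the GC; on typical JVMs this enqueues the phantom ref quickly.
        for (int i = 0; i < 100 && closed.get() == 0; i++) {
            System.gc();
            Thread.sleep(20);
        }
        if (closed.get() != 1) {
            throw new IllegalStateException("cleaner did not free the handle");
        }
        System.out.println("native handle freed by cleaner thread");
    }
}
```

Compared with `finalize()`, this keeps cleanup off the finalizer thread, avoids object resurrection hazards, and lets the owner also close handles eagerly on shutdown.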
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1119217950
> `SoftReference` still allows it to be reclaimed on a full GC. If that helps,
I think it's OK to change, as we do not expect many open iterators at any one
time. Does that help the issue
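
The `SoftReference` behavior being discussed can be shown with a small sketch (illustrative only; `SoftTrackSketch` and `TrackedIterator` are invented names). Unlike a `WeakReference`, a softly referenced object is only reclaimed when the JVM is short on memory, so a soft-reference tracker usually still sees the iterator; but `get()` may return `null`, so any cleanup path must check for that:

```java
import java.lang.ref.SoftReference;

public class SoftTrackSketch {
    static class TrackedIterator {
        boolean closed = false;
        void close() { closed = true; }
    }

    public static void main(String[] args) {
        TrackedIterator it = new TrackedIterator();
        SoftReference<TrackedIterator> ref = new SoftReference<>(it);
        // A routine GC does not clear a soft reference to a live object.
        System.gc();
        TrackedIterator seen = ref.get();
        if (seen != null) {
            seen.close();  // the tracker can still reach and close it
        }
        if (!it.closed) {
            throw new IllegalStateException("soft reference was cleared unexpectedly");
        }
        System.out.println("iterator closed via soft reference");
    }
}
```

The trade-off: a soft reference keeps the tracker useful far longer than a weak one, while still letting the GC reclaim forgotten iterators under memory pressure rather than pinning them forever.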
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1114921698
> Changing `WeakReference` in `iteratorTracker` to a strong reference should
avoid the issue I mentioned above; all `LevelDB/RocksDBIterator` instances not
explicitly closed in Spark code will be closed
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1114905055
Changing `WeakReference` in `iteratorTracker` to a strong reference should
avoid the issue I mentioned above; all `LevelDB/RocksDBIterator` instances not
explicitly closed in Spark code will be closed
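
The strong-reference variant proposed here can be sketched like this (again illustrative; `StrongTrackerSketch` and its members are invented names, not the PR's actual classes). Because the store holds a strong reference to every open iterator, an iterator the caller forgot to close stays reachable, and the store can close it itself during shutdown; explicit `close()` deregisters the iterator so the set does not grow unboundedly:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class StrongTrackerSketch {
    static class TrackedIterator implements AutoCloseable {
        private final StrongTrackerSketch owner;
        boolean closed = false;
        TrackedIterator(StrongTrackerSketch owner) { this.owner = owner; }
        @Override public void close() {
            closed = true;
            owner.iterators.remove(this);  // deregister on explicit close
        }
    }

    // Strong references: entries live until close() removes them.
    private final Set<TrackedIterator> iterators = ConcurrentHashMap.newKeySet();

    public TrackedIterator newIterator() {
        TrackedIterator it = new TrackedIterator(this);
        iterators.add(it);
        return it;
    }

    // Closing the store closes every iterator the caller leaked.
    public void close() {
        for (TrackedIterator it : iterators) {
            it.closed = true;
        }
        iterators.clear();
    }

    public static void main(String[] args) throws Exception {
        StrongTrackerSketch store = new StrongTrackerSketch();
        TrackedIterator explicit = store.newIterator();
        TrackedIterator leaked = store.newIterator();
        explicit.close();  // caller cleans up one iterator itself
        store.close();     // the leaked one is closed here
        if (!explicit.closed || !leaked.closed) {
            throw new IllegalStateException("tracker failed to close all iterators");
        }
        System.out.println("all iterators closed");
    }
}
```

The cost of this approach is the one raised earlier in the thread: a never-closed iterator is never eligible for GC until the store itself closes, whereas weak or soft references let the collector reclaim it sooner.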
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1114466277
https://github.com/apache/spark/blob/5630f700768432396a948376f5b46b00d4186e1b/common/kvstore/src/main/java/org/apache/spark/util/kvstore/RocksDB.java#L332-L356
Actually, if use
LuciferYang commented on PR #36403:
URL: https://github.com/apache/spark/pull/36403#issuecomment-1113994582
https://github.com/LuciferYang/spark/compare/SPARK-39063...LuciferYang:SPARK-39063-backup?expand=1
I added an additional global Tracker to audit the `create` and `close`
operations
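
A global create/close audit of the kind described can be as simple as a pair of counters (a minimal sketch; `IteratorAuditSketch` and its method names are invented for illustration and are not the tracker from the linked branch). Any run where creates outnumber closes points at iterators that were never closed:

```java
import java.util.concurrent.atomic.AtomicLong;

public class IteratorAuditSketch {
    private static final AtomicLong created = new AtomicLong();
    private static final AtomicLong closedCount = new AtomicLong();

    // Call from the iterator constructor.
    static void onCreate() { created.incrementAndGet(); }
    // Call from the iterator's close().
    static void onClose() { closedCount.incrementAndGet(); }
    // A positive value means some iterators were created but never closed.
    static long leaked() { return created.get() - closedCount.get(); }

    public static void main(String[] args) {
        onCreate(); onCreate(); onCreate();  // three iterators created
        onClose(); onClose();                // only two closed
        System.out.println("leaked iterators: " + leaked());
    }
}
```

Hooking such counters into the create and close paths makes a leak visible in tests as a nonzero difference at shutdown, without changing how the iterators themselves are referenced.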