Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19369#discussion_r141661720

    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskStore.scala ---
    @@ -49,7 +49,7 @@ private[spark] class DiskStore(
       private val minMemoryMapBytes = conf.getSizeAsBytes("spark.storage.memoryMapThreshold", "2m")

       private val blockSizes = new ConcurrentHashMap[String, Long]()

    -  def getSize(blockId: BlockId): Long = blockSizes.get(blockId.name)
    +  def getSize(blockId: BlockId): Long = blockSizes.get(blockId)
    --- End diff --

    Hm, how does this work if the map keys are still strings?
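The concern is well founded: `java.util.Map#get` accepts any `Object`, so passing a `BlockId` to a map keyed by `String` still compiles, but the lookup can never match. A minimal sketch of that pitfall, using a hypothetical stand-in for Spark's `BlockId` (names here are illustrative, not Spark's actual implementation):

```java
import java.util.concurrent.ConcurrentHashMap;

public class MapGetDemo {
    // Hypothetical stand-in for Spark's BlockId, for illustration only.
    static final class BlockId {
        final String name;
        BlockId(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        ConcurrentHashMap<String, Long> blockSizes = new ConcurrentHashMap<>();
        blockSizes.put("rdd_0_0", 1024L);

        BlockId id = new BlockId("rdd_0_0");
        // Map#get takes Object, so this compiles even though the keys are
        // Strings -- but a BlockId never equals a String, so the lookup
        // silently misses and returns null.
        System.out.println(blockSizes.get(id));       // null
        System.out.println(blockSizes.get(id.name));  // 1024
    }
}
```

In the Scala original the miss is even quieter: unboxing the `null` returned by `get` yields `0L` rather than throwing, so `getSize` would report every block as size 0 without any error.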