srowen commented on a change in pull request #26911: Remove usage of Guava that breaks in 27; replace with workalikes
URL: https://github.com/apache/spark/pull/26911#discussion_r358416331
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/SparkEnv.scala
 ##########
 @@ -76,7 +76,8 @@ class SparkEnv (
 
  // A general, soft-reference map for metadata needed during HadoopRDD split computation
   // (e.g., HadoopFileRDD uses this to cache JobConfs and InputFormats).
-  private[spark] val hadoopJobMetadata = new MapMaker().softValues().makeMap[String, Any]()
+  private[spark] val hadoopJobMetadata =
 
 Review comment:
   This object was really a cache anyway, and it was recommended to use CacheBuilder instead of MapMaker.
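
   A minimal sketch of what a CacheBuilder-based workalike could look like (the added line in the quoted diff above is truncated, so this is an assumption about the shape of the change, not necessarily the exact code in the PR; note the value type would become AnyRef rather than Any, since a Guava cache holds object references):

       import java.util.concurrent.ConcurrentMap
       import com.google.common.cache.CacheBuilder

       // Soft-valued Guava cache exposed as a ConcurrentMap, mirroring
       // new MapMaker().softValues().makeMap[String, Any]():
       // values can be reclaimed by the GC under memory pressure.
       private[spark] val hadoopJobMetadata: ConcurrentMap[String, AnyRef] =
         CacheBuilder.newBuilder().softValues().build[String, AnyRef]().asMap()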

