Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21709#discussion_r200219590
  
    --- Diff: core/src/main/scala/org/apache/spark/metrics/MetricsConfig.scala ---
    @@ -129,8 +131,11 @@ private[spark] class MetricsConfig(conf: SparkConf) extends Logging {
         var is: InputStream = null
         try {
           is = path match {
     -        case Some(f) => new FileInputStream(f)
     -        case None => Utils.getSparkClassLoader.getResourceAsStream(DEFAULT_METRICS_CONF_FILENAME)
    +        case Some(f) =>
    +          val hadoopPath = new Path(Utils.resolveURI(f))
     +          Utils.getHadoopFileSystem(hadoopPath.toUri, new Configuration()).open(hadoopPath)
    --- End diff ---
    
    You should use the `Configuration` object from `SparkHadoopUtil` rather than creating a new one here. A freshly created `Configuration` object may miss configurations set via "spark.hadoop.xxxx".
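    To illustrate the reviewer's point: Spark lets users set Hadoop options through `SparkConf` keys prefixed with `spark.hadoop.`, and `SparkHadoopUtil` copies those (prefix stripped) into the Hadoop `Configuration` it builds, whereas a bare `new Configuration()` never sees them. The sketch below mimics that propagation with a plain map so it runs without Spark or Hadoop on the classpath; the object and method names are hypothetical stand-ins, not Spark's actual API.

    ```scala
    // Hypothetical sketch of the "spark.hadoop.*" propagation that a plain
    // `new Configuration()` skips. In real Spark this copying happens inside
    // SparkHadoopUtil when it constructs the Hadoop Configuration.
    object HadoopConfSketch {
      // Stand-in for SparkConf.getAll: a sequence of (key, value) pairs.
      def hadoopConfFrom(sparkSettings: Seq[(String, String)]): Map[String, String] =
        sparkSettings.collect {
          // Keep only Hadoop-targeted keys and strip the Spark prefix.
          case (k, v) if k.startsWith("spark.hadoop.") =>
            k.stripPrefix("spark.hadoop.") -> v
        }.toMap

      def main(args: Array[String]): Unit = {
        val settings = Seq(
          "spark.app.name" -> "demo",
          "spark.hadoop.fs.s3a.endpoint" -> "s3.example.com"
        )
        // Only the spark.hadoop.* entry survives, with its prefix removed.
        println(hadoopConfFrom(settings))
      }
    }
    ```

    In the PR itself, the straightforward change along these lines would be to obtain the Hadoop configuration via `SparkHadoopUtil` (which accepts the `SparkConf` already available in `MetricsConfig`) instead of constructing `new Configuration()` inline.
    
    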


---
