[ https://issues.apache.org/jira/browse/SPARK-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell updated SPARK-7807:
-----------------------------------
    Component/s: Spark Core

> High-Availability: SparkHadoopUtil.scala should support hadoopConfiguration.addResource()
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7807
>                 URL: https://issues.apache.org/jira/browse/SPARK-7807
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>         Environment: Running Spark against a remote Hadoop HA cluster. Ease of
> use with the spark.hadoop.url. prefix:
> 1) the user can supply SparkConf entries with the spark.hadoop.url. prefix, such as
> spark.hadoop.url.core-site and spark.hadoop.url.hdfs-site.
>            Reporter: Norman He
>            Priority: Trivial
>              Labels: easyfix
>
> Line 97: the code below
>
> conf.getAll.foreach { case (key, value) =>
>   if (key.startsWith("spark.hadoop.")) {
>     hadoopConf.set(key.substring("spark.hadoop.".length), value)
>   }
> }
>
> should be changed to the new version:
>
> conf.getAll.foreach { case (key, value) =>
>   if (key.startsWith("spark.hadoop.")) {
>     if (key.startsWith("spark.hadoop.url.")) {
>       hadoopConf.addResource(new URL(value))
>     } else {
>       hadoopConf.set(key.substring("spark.hadoop.".length), value)
>     }
>   }
> }

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
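The proposed dispatch on the spark.hadoop.url. prefix can be sketched as a self-contained Scala snippet. This is a sketch only: a plain mutable Map and a Buffer stand in for Hadoop's Configuration object (hadoopConf.set / hadoopConf.addResource), and the object name SparkHadoopPrefixSketch and the split helper are hypothetical names introduced here for illustration; only the prefixes and key-stripping logic come from the issue itself.

```scala
import scala.collection.mutable

object SparkHadoopPrefixSketch {
  val SparkHadoopPrefix = "spark.hadoop."
  val UrlPrefix = "spark.hadoop.url."

  // Splits SparkConf entries the way the proposed SparkHadoopUtil change would:
  // keys under spark.hadoop.url. are treated as config resource URLs
  // (would become hadoopConf.addResource(new java.net.URL(value))), while the
  // remaining spark.hadoop.* keys are set directly with the prefix stripped
  // (would become hadoopConf.set(strippedKey, value)).
  def split(conf: Seq[(String, String)]): (Map[String, String], Seq[String]) = {
    val props = mutable.Map.empty[String, String]
    val resources = mutable.Buffer.empty[String]
    conf.foreach { case (key, value) =>
      if (key.startsWith(SparkHadoopPrefix)) {
        if (key.startsWith(UrlPrefix)) {
          resources += value
        } else {
          props(key.substring(SparkHadoopPrefix.length)) = value
        }
      }
    }
    (props.toMap, resources.toSeq)
  }

  def main(args: Array[String]): Unit = {
    val (props, urls) = split(Seq(
      "spark.hadoop.url.core-site" -> "http://example.com/core-site.xml",
      "spark.hadoop.fs.defaultFS" -> "hdfs://ha-cluster",
      "spark.app.name" -> "demo" // no spark.hadoop. prefix, ignored
    ))
    println(props)
    println(urls)
  }
}
```

Note the ordering of the two startsWith checks matters: spark.hadoop.url. keys also match the spark.hadoop. prefix, so the URL case must be tested first, exactly as in the new version quoted above.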