You can point Spark at a custom HADOOP_CONF_DIR in spark-env.sh. Spark will then pick up the core-site.xml and hdfs-site.xml it finds there, without touching the configuration your regular HDFS uses.
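For example (the path below is just an illustration; use whichever directory holds your overriding XML files):

    # conf/spark-env.sh
    export HADOOP_CONF_DIR=/opt/spark/hadoop-conf-override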
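If you want to stay with spark.hadoop.* properties instead, note that client-side HA resolution needs the complete set of keys, in particular dfs.nameservices and the failover proxy provider; if the logical nameservice is never defined, the client tries to resolve it as a hostname, which typically surfaces as exactly the UnknownHostException you are seeing. A minimal sketch, assuming a hypothetical nameservice "mycluster" and NameNode hosts nn1-host/nn2-host on the default port 8020:

    import org.apache.spark.{SparkConf, SparkContext}

    // Every property prefixed with "spark.hadoop." is copied into the
    // Hadoop Configuration that Spark builds for HDFS access.
    val conf = new SparkConf()
      .setAppName("HdfsHaExample")
      .set("spark.hadoop.fs.defaultFS", "hdfs://mycluster")
      .set("spark.hadoop.dfs.nameservices", "mycluster")
      .set("spark.hadoop.dfs.ha.namenodes.mycluster", "nn1,nn2")
      .set("spark.hadoop.dfs.namenode.rpc-address.mycluster.nn1", "nn1-host:8020")
      .set("spark.hadoop.dfs.namenode.rpc-address.mycluster.nn2", "nn2-host:8020")
      .set("spark.hadoop.dfs.client.failover.proxy.provider.mycluster",
        "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")
    val sc = new SparkContext(conf)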
Regards
Sab

On 01-Oct-2015 5:22 pm, "Vinoth Sankar" <vinoth9...@gmail.com> wrote:

> Hi,
>
> I'm new to Spark. For my application I need to override the Hadoop
> configuration (I can't change the configuration in Hadoop itself, as that
> might affect my regular HDFS) so that the NameNode IPs get resolved
> automatically. What are the ways to do this? I tried setting
> "spark.hadoop.dfs.ha.namenodes.nn",
> "spark.hadoop.dfs.namenode.rpc-address.nn",
> "spark.hadoop.dfs.namenode.http-address.nn" and other core-site and
> hdfs-site properties on the SparkConf object, but I still get an
> UnknownHostException.
>
> Regards
> Vinoth Sankar