Hi,

I'm new to Spark. For my application I need to override the Hadoop
configuration from within Spark (I can't change the configuration on the
Hadoop side, as that might affect my regular HDFS), so that the NameNode
IPs get resolved automatically. What are the ways to do this? I tried
setting "spark.hadoop.dfs.ha.namenodes.nn",
"spark.hadoop.dfs.namenode.rpc-address.nn",
"spark.hadoop.dfs.namenode.http-address.nn" and other core-site and
hdfs-site properties on the SparkConf object, but I still get an
UnknownHostException.
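
For reference, below is a minimal sketch of what I'm building in the
driver. The nameservice ID "nn", the hostnames, and the ports are
placeholders, and the failover proxy provider line is my guess at what
else an HA client setup needs:

  import org.apache.spark.SparkConf

  // Every spark.hadoop.* key should be copied into the Hadoop
  // Configuration that Spark builds, overriding core-site.xml and
  // hdfs-site.xml. Nameservice ID "nn" and hosts are placeholders.
  val conf = new SparkConf()
    .setAppName("hdfs-ha-test")
    .set("spark.hadoop.fs.defaultFS", "hdfs://nn")
    .set("spark.hadoop.dfs.nameservices", "nn")
    .set("spark.hadoop.dfs.ha.namenodes.nn", "nn1,nn2")
    .set("spark.hadoop.dfs.namenode.rpc-address.nn.nn1",
      "namenode1.example.com:8020")
    .set("spark.hadoop.dfs.namenode.rpc-address.nn.nn2",
      "namenode2.example.com:8020")
    .set("spark.hadoop.dfs.namenode.http-address.nn.nn1",
      "namenode1.example.com:50070")
    .set("spark.hadoop.dfs.namenode.http-address.nn.nn2",
      "namenode2.example.com:50070")
    .set("spark.hadoop.dfs.client.failover.proxy.provider.nn",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")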

Regards
Vinoth Sankar
