Hi,
I know we can set HADOOP_CONF_DIR in spark-env.sh, but we want to set HADOOP_CONF_DIR and HIVE_HOME for Spark from Java code so we can target different clusters. Is there a way to set the spark-env variables programmatically?
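One option, since spark-env.sh variables are just environment variables read at launch time, is to start spark-submit as a child process and set the variables on that process. Below is a minimal sketch using the JDK's ProcessBuilder; the class name, jar path, and config directories are hypothetical placeholders, and Spark's own org.apache.spark.launcher.SparkLauncher offers a similar (and more idiomatic) way to pass an environment map when launching an application.

```java
import java.io.IOException;
import java.util.Map;

public class SubmitWithEnv {

    // Build a spark-submit invocation whose child process sees
    // cluster-specific environment variables. All paths and the
    // main class below are hypothetical examples.
    static ProcessBuilder buildSubmit(String hadoopConfDir, String hiveHome) {
        ProcessBuilder pb = new ProcessBuilder(
                "spark-submit",
                "--class", "com.example.MyApp",   // hypothetical main class
                "/path/to/app.jar");              // hypothetical application jar
        Map<String, String> env = pb.environment();
        // These override whatever spark-env.sh would set, for this launch only.
        env.put("HADOOP_CONF_DIR", hadoopConfDir);
        env.put("HIVE_HOME", hiveHome);
        return pb;
    }

    public static void main(String[] args) throws IOException {
        ProcessBuilder pb = buildSubmit("/etc/hadoop/conf-clusterA", "/opt/hive");
        System.out.println(pb.environment().get("HADOOP_CONF_DIR"));
        // pb.inheritIO().start();  // uncomment to actually launch spark-submit
    }
}
```

Because each launch gets its own environment map, you can point successive submissions at different clusters without touching spark-env.sh.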

Thanks for any replies.