programmatically set hadoop_conf_dir for spark
Data and AI Product Development Department
Thu, 15 Nov 2018 17:45:01 -0800
Hi,

I know we can set HADOOP_CONF_DIR in spark-env.sh, but we want to set HADOOP_CONF_DIR and HIVE_HOME for Spark from Java code, so the same program can target different clusters. Is there a way to set the spark-env variables programmatically?

Thanks for any replies
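One way to sketch this (an assumption, not the only approach): since spark-env.sh just exports environment variables for the launched JVM, you can supply them yourself when launching the application from Java, e.g. via `org.apache.spark.launcher.SparkLauncher`, whose constructor takes a child-process environment map, or via a plain `ProcessBuilder` around spark-submit. The paths and class names below are hypothetical placeholders.

```java
import java.util.HashMap;
import java.util.Map;

public class SparkEnvLauncher {
    // Build the environment for the child Spark process.
    // The HADOOP_CONF_DIR / HIVE_HOME paths are hypothetical examples.
    static Map<String, String> buildEnv(String hadoopConfDir, String hiveHome) {
        Map<String, String> env = new HashMap<>();
        env.put("HADOOP_CONF_DIR", hadoopConfDir);
        env.put("HIVE_HOME", hiveHome);
        return env;
    }

    public static void main(String[] args) {
        Map<String, String> env = buildEnv("/etc/hadoop/cluster-a/conf", "/opt/hive");

        // Option 1 (sketch, requires the spark-launcher dependency on the classpath):
        //   SparkAppHandle handle = new SparkLauncher(env)   // env applied to child JVM
        //       .setAppResource("my-app.jar")                // hypothetical jar
        //       .setMainClass("com.example.Main")            // hypothetical main class
        //       .setMaster("yarn")
        //       .startApplication();

        // Option 2: launch spark-submit directly and inject the variables.
        ProcessBuilder pb = new ProcessBuilder("spark-submit", "--version");
        pb.environment().putAll(env);
        System.out.println(pb.environment().get("HADOOP_CONF_DIR"));
    }
}
```

Swapping the map's values per target cluster lets one driver program point Spark at different Hadoop/Hive installations without editing spark-env.sh.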
0049003208
0049003...@znv.com