Hi,
I have set properties in conf/spark-defaults.conf and started the server with the command
./sbin/start-history-server.sh /tmp/spark-events. It fails with errors, and it seems
that the properties in spark-defaults.conf have no effect. How can I
solve this problem (i.e., enable the properties in spark-defaults.conf)?
The history server (and other Spark daemons) do not read
spark-defaults.conf. There's a bug open to implement that
(SPARK-2098), and an open PR to fix it, but it's still not in Spark.
On Wed, Sep 3, 2014 at 11:00 AM, Zhanfeng Huo huozhanf...@gmail.com wrote:
Hi Zhanfeng,
You will need to set these through SPARK_HISTORY_OPTS in conf/spark-env.sh.
This is documented here: http://spark.apache.org/docs/latest/monitoring.html.
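As a minimal sketch of what that might look like for a Kerberized HDFS (the principal, keytab path, and log directory below are placeholders, and the spark.history.* property names are taken from the monitoring docs linked above):

```shell
# conf/spark-env.sh
# The history server does not read spark-defaults.conf, so pass its
# properties as -D system properties via SPARK_HISTORY_OPTS.
# The principal, keytab, and log directory are placeholders; substitute your own.
export SPARK_HISTORY_OPTS="\
  -Dspark.history.fs.logDirectory=hdfs:///tmp/spark-events \
  -Dspark.history.kerberos.enabled=true \
  -Dspark.history.kerberos.principal=spark/_HOST@EXAMPLE.COM \
  -Dspark.history.kerberos.keytab=/etc/security/keytabs/spark.keytab"
```

With this in place, start the daemon as usual with ./sbin/start-history-server.sh and the properties are picked up from the JVM system properties rather than from spark-defaults.conf.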
Let me know if you have it working,
-Andrew
2014-09-03 11:14 GMT-07:00 Marcelo Vanzin van...@cloudera.com:
Thanks for your help.
It works after setting SPARK_HISTORY_OPTS.
Zhanfeng Huo
From: Andrew Or
Date: 2014-09-04 07:52
To: Marcelo Vanzin
CC: Zhanfeng Huo; user
Subject: Re: How can I start history-server with kerberos HDFS ?