Okay, I put the props in spark-defaults.conf, but they are not recognized; they don't appear in the 'Environment' tab during an application execution.
spark.eventLog.enabled, for example.
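(For context, a minimal sketch of what such entries look like in conf/spark-defaults.conf on the machine that launches the application; the HDFS path below is a placeholder, not from the thread:)

```properties
# Read by spark-submit from $SPARK_HOME/conf/spark-defaults.conf
# on the machine that launches the application.
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://namenode:8021/spark-logs
```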
On 01.04.2016 at 21:22, Ted Yu wrote:
Please read
https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties
w.r.t. spark-defaults.conf
On Fri, Apr 1, 2016 at 12:06 PM, Max Schmidt wrote:
Yes, but the doc doesn't say for which process the configs are valid, so do I have to set them for the history server? The daemon? The workers?
And what if I use the Java API instead of spark-submit for the jobs?
I guess spark-defaults.conf is ignored when using the Java API?
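(Editorial sketch: with the Java API, properties can also be set programmatically on a SparkConf before the context is created; such settings take precedence over spark-defaults.conf. The app name and master URL below are placeholders:)

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ConfExample {
    public static void main(String[] args) {
        // Properties set here override any spark-defaults.conf entries.
        SparkConf conf = new SparkConf()
                .setAppName("conf-example")           // placeholder name
                .setMaster("spark://master:7077")     // placeholder master URL
                .set("spark.eventLog.enabled", "true")
                .set("spark.ui.retainedJobs", "500");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... run jobs ...
        sc.stop();
    }
}
```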
You can set them in spark-defaults.conf
See also https://spark.apache.org/docs/latest/configuration.html#spark-ui
On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt wrote:
Can somebody tell me the interaction between the properties:
spark.ui.retainedJobs
spark.ui.retainedStages
spark.history.retainedApplications
I know from the bug tracker that the last one describes the number of applications the history server holds in memory.
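(Editorial note, hedged: per the Spark 1.x docs, the first two are per-application limits read by the driver for the live UI, while the last is read by the history server daemon. Defaults shown below are the documented ones:)

```properties
# Driver-side: how many finished jobs/stages the live application UI retains
# before old entries are dropped to bound memory use.
spark.ui.retainedJobs               1000
spark.ui.retainedStages             1000
# History-server-side: how many application UIs are cached in memory;
# evicted ones are rebuilt from the event log on demand.
spark.history.retainedApplications  50
```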
Can I set the properties in the