Please read
https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties
w.r.t. spark-defaults.conf
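That section also spells out the precedence: values set directly on SparkConf
win over flags passed to spark-submit, which in turn win over
spark-defaults.conf. As a rough sketch (class and jar names are made up), the
UI properties can be passed per application like this:

  ./bin/spark-submit \
    --class com.example.MyJob \
    --conf spark.ui.retainedJobs=500 \
    --conf spark.ui.retainedStages=500 \
    my-job.jar

spark.ui.* settings passed this way apply to that application's live web UI,
not to the separately running history server daemon.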

On Fri, Apr 1, 2016 at 12:06 PM, Max Schmidt <m...@datapath.io> wrote:

> Yes, but the docs don't say which process each of these settings applies
> to. Do I have to set them for the history server? The daemon? The workers?
>
> And what if I use the Java API instead of spark-submit to run the jobs?
>
> I guess spark-defaults.conf is simply ignored when using the Java API?
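>
> For concreteness, a sketch of what I mean by the Java API (class and app
> name are made up; the values are only illustrative):
>
>   import org.apache.spark.SparkConf;
>   import org.apache.spark.api.java.JavaSparkContext;
>
>   public class RetainedJobsExample {
>     public static void main(String[] args) {
>       // Properties set directly on SparkConf take precedence over
>       // anything in spark-defaults.conf.
>       SparkConf conf = new SparkConf()
>           .setAppName("retained-jobs-example")
>           .setMaster("local[*]")
>           .set("spark.ui.retainedJobs", "500")
>           .set("spark.ui.retainedStages", "500");
>       JavaSparkContext sc = new JavaSparkContext(conf);
>       sc.stop();
>     }
>   }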
>
>
> On 2016-04-01 18:58, Ted Yu wrote:
>
>> You can set them in spark-defaults.conf
>>
>> See also https://spark.apache.org/docs/latest/configuration.html#spark-ui [1]
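>>
>> For example, a minimal spark-defaults.conf could look like this (the
>> values are illustrative; the shipped defaults are 1000 retained jobs,
>> 1000 retained stages, and 50 retained applications):
>>
>>   spark.ui.retainedJobs                 500
>>   spark.ui.retainedStages               500
>>   spark.history.retainedApplications    50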
>>
>> On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt <m...@datapath.io> wrote:
>>
>>> Can somebody tell me how the following properties interact:
>>>
>>> spark.ui.retainedJobs
>>> spark.ui.retainedStages
>>> spark.history.retainedApplications
>>>
>>> I know from the bug tracker that the last one describes the number of
>>> applications the history server holds in memory.
>>>
>>> Can I set these properties in spark-env.sh? And if not, where?
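>>>
>>> For concreteness, what I had in mind is something like the following in
>>> conf/spark-env.sh, since that file takes environment variables rather
>>> than Spark properties directly:
>>>
>>>   # on the machine running the history server
>>>   export SPARK_HISTORY_OPTS="-Dspark.history.retainedApplications=50"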
>>>
>>
>> Links:
>> ------
>> [1] https://spark.apache.org/docs/latest/configuration.html#spark-ui
>>
>
