Re: Where to set properties for the retainedJobs/Stages?

2016-04-04 Thread Max Schmidt
Okay, I put the props in spark-defaults.conf, but they are not recognized,
as they don't appear in the 'Environment' tab during an application
execution.

spark.eventLog.enabled, for example.
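
For reference, one way to check which properties the driver actually picked
up is to dump the SparkConf at runtime. A minimal sketch in Java, as far as I
understand the loading order (class name, app name and the local master are
made up so the snippet runs on its own):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ConfCheck {
        public static void main(String[] args) {
            // new SparkConf() only picks up spark.* JVM system properties;
            // spark-defaults.conf is merged in by spark-submit, not here.
            SparkConf conf = new SparkConf()
                    .setAppName("conf-check")
                    .setMaster("local[*]");   // placeholder for a quick local test
            JavaSparkContext sc = new JavaSparkContext(conf);
            // Prints every property the driver has loaded.
            System.out.println(sc.getConf().toDebugString());
            // The second argument is the fallback when the key is not set anywhere.
            System.out.println(sc.getConf().get("spark.eventLog.enabled", "<not set>"));
            sc.stop();
        }
    }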

On 01.04.2016 at 21:22, Ted Yu wrote:
> Please read
> https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties
> w.r.t. spark-defaults.conf
>
> On Fri, Apr 1, 2016 at 12:06 PM, Max Schmidt wrote:
>
> Yes, but the doc doesn't say a word about which process the configs
> are valid for, so do I have to set them for the history-server? The
> daemon? The workers?
>
> And what if I use the Java API instead of spark-submit for the jobs?
>
> I guess that spark-defaults.conf does not apply when using the Java API?
>
>
> On 2016-04-01 18:58, Ted Yu wrote:
>
> You can set them in spark-defaults.conf
>
> See also https://spark.apache.org/docs/latest/configuration.html#spark-ui
>
> On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt wrote:
>
> Can somebody tell me the interaction between the properties:
>
> spark.ui.retainedJobs
> spark.ui.retainedStages
> spark.history.retainedApplications
>
> I know from the bugtracker that the last one describes the number of
> applications the history-server holds in memory.
>
> Can I set the properties in the spark-env.sh? And where?

--
Max Schmidt, Senior Java Developer | m...@datapath.io

Datapath.io GmbH
Mainz | HRB Nr. 46222
Sebastian Spies, CEO



Re: Where to set properties for the retainedJobs/Stages?

2016-04-01 Thread Ted Yu
Please read
https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties
w.r.t. spark-defaults.conf
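
If it helps, the short version of that section as I read it (illustrative
values and placeholder class/jar names below): an explicit setting on the
SparkConf wins over a --conf flag passed to spark-submit, which in turn wins
over an entry in conf/spark-defaults.conf.

    # conf/spark-defaults.conf on the machine you run spark-submit from
    spark.ui.retainedJobs      500
    spark.ui.retainedStages    500
    spark.eventLog.enabled     true

    # the same properties can also be passed per job on the command line
    spark-submit \
      --conf spark.ui.retainedJobs=200 \
      --conf spark.eventLog.enabled=true \
      --class com.example.MyJob my-job.jar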

On Fri, Apr 1, 2016 at 12:06 PM, Max Schmidt wrote:

> Yes, but the doc doesn't say a word about which process the configs are
> valid for, so do I have to set them for the history-server? The daemon? The
> workers?
>
> And what if I use the Java API instead of spark-submit for the jobs?
>
> I guess that spark-defaults.conf does not apply when using the Java API?
>
>
> On 2016-04-01 18:58, Ted Yu wrote:
>
>> You can set them in spark-defaults.conf
>>
>> See also https://spark.apache.org/docs/latest/configuration.html#spark-ui
>>
>> On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt wrote:
>>
>>> Can somebody tell me the interaction between the properties:
>>>
>>> spark.ui.retainedJobs
>>> spark.ui.retainedStages
>>> spark.history.retainedApplications
>>>
>>> I know from the bugtracker that the last one describes the number of
>>> applications the history-server holds in memory.
>>>
>>> Can I set the properties in the spark-env.sh? And where?


Re: Where to set properties for the retainedJobs/Stages?

2016-04-01 Thread Max Schmidt
Yes, but the doc doesn't say a word about which process the configs are
valid for, so do I have to set them for the history-server? The daemon? The
workers?

And what if I use the Java API instead of spark-submit for the jobs?

I guess that spark-defaults.conf does not apply when using the Java API?
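
If that guess is right, the alternative I would try is to set the properties
on the SparkConf itself (or as -Dspark.* JVM system properties). A sketch
with made-up values:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class RetainedJobsExample {
        public static void main(String[] args) {
            // Property names are from the configuration docs; the values and
            // the app/master strings are only examples.
            SparkConf conf = new SparkConf()
                    .setAppName("retained-jobs-example")
                    .setMaster("local[*]")
                    .set("spark.ui.retainedJobs", "500")
                    .set("spark.ui.retainedStages", "500");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... run the job ...
            sc.stop();
        }
    }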


On 2016-04-01 18:58, Ted Yu wrote:

You can set them in spark-defaults.conf

See also https://spark.apache.org/docs/latest/configuration.html#spark-ui


On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt wrote:


Can somebody tell me the interaction between the properties:

spark.ui.retainedJobs
spark.ui.retainedStages
spark.history.retainedApplications

I know from the bugtracker that the last one describes the number of
applications the history-server holds in memory.

Can I set the properties in the spark-env.sh? And where?



Re: Where to set properties for the retainedJobs/Stages?

2016-04-01 Thread Ted Yu
You can set them in spark-defaults.conf

See also https://spark.apache.org/docs/latest/configuration.html#spark-ui
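
For example, something like this in conf/spark-defaults.conf (values are only
illustrative; as I understand it, the spark.ui.* entries are read by each
application's driver, while spark.history.retainedApplications matters to the
history server process):

    spark.ui.retainedJobs               500
    spark.ui.retainedStages             500
    spark.history.retainedApplications  50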

On Fri, Apr 1, 2016 at 8:26 AM, Max Schmidt wrote:

> Can somebody tell me the interaction between the properties:
>
> spark.ui.retainedJobs
> spark.ui.retainedStages
> spark.history.retainedApplications
>
> I know from the bugtracker that the last one describes the number of
> applications the history-server holds in memory.
>
> Can I set the properties in the spark-env.sh? And where?


Where to set properties for the retainedJobs/Stages?

2016-04-01 Thread Max Schmidt
Can somebody tell me the interaction between the properties:

spark.ui.retainedJobs
spark.ui.retainedStages
spark.history.retainedApplications

I know from the bugtracker that the last one describes the number of
applications the history-server holds in memory.

Can I set the properties in the spark-env.sh? And where?
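
From what I have found so far, spark-env.sh is meant for environment
variables rather than spark.* properties, though the history server can
apparently be given properties there via SPARK_HISTORY_OPTS, e.g.:

    # conf/spark-env.sh on the host running the history server
    # (example value; the spark.ui.* settings seem to belong in
    #  spark-defaults.conf or on the SparkConf of each application instead)
    SPARK_HISTORY_OPTS="-Dspark.history.retainedApplications=50"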
