Re: [Proposal] Enabling time series analysis on spark metrics

2016-03-03 Thread Karan Kumar
Precisely. Found a JIRA in this regard: SPARK-10610
<https://issues.apache.org/jira/browse/SPARK-10610>
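
For reference, SPARK-10610 was later resolved along these lines by adding a `spark.metrics.namespace` configuration property; when it is unset, Spark falls back to the application ID. A usage sketch (the job name and class here are made up for illustration):

```
# Pin a stable metrics key across runs of the same job, so a sink such as
# Graphite can build a continuous time series instead of one series per appId.
spark-submit \
  --conf spark.metrics.namespace=nightly_etl \
  --class com.example.NightlyEtl \
  nightly-etl.jar
```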

On Wed, Mar 2, 2016 at 3:36 AM, Reynold Xin <r...@databricks.com> wrote:

> Is the suggestion just to use a different config (and maybe fallback to
> appid) in order to publish metrics? Seems reasonable.
>
>
> On Tue, Mar 1, 2016 at 8:17 AM, Karan Kumar <karankumar1...@gmail.com>
> wrote:
>
>> +dev mailing list
>>
>> Time series analysis on metrics becomes quite useful when running Spark
>> jobs using a workflow manager like Oozie.
>>
>> Would love to take this up if the community thinks it's worthwhile.
>>
>> On Tue, Feb 23, 2016 at 2:59 PM, Karan Kumar <karankumar1...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Spark at the moment uses the application ID to report metrics. I was
>>> wondering if we could add an option to export metrics under a
>>> user-controlled key. This would allow us to do time series analysis on
>>> counters by dumping them into a DB such as Graphite.
>>>
>>> One of the approaches I had in mind was allowing a user to set a
>>> property via the Spark client. If that property is set, use the property
>>> value to report metrics; otherwise use the current implementation
>>> <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala>
>>> of reporting metrics on the appid.
>>>
>>> Thoughts?
>>>
>>> --
>>> Thanks
>>> Karan
>>>
>>
>>
>>
>> --
>> Thanks
>> Karan
>>
>
>


-- 
Thanks
Karan


Re: [Proposal] Enabling time series analysis on spark metrics

2016-03-01 Thread Karan Kumar
+dev mailing list

Time series analysis on metrics becomes quite useful when running Spark
jobs using a workflow manager like Oozie.

Would love to take this up if the community thinks it's worthwhile.

On Tue, Feb 23, 2016 at 2:59 PM, Karan Kumar <karankumar1...@gmail.com>
wrote:

> Hi,
>
> Spark at the moment uses the application ID to report metrics. I was
> wondering if we could add an option to export metrics under a
> user-controlled key. This would allow us to do time series analysis on
> counters by dumping them into a DB such as Graphite.
>
> One of the approaches I had in mind was allowing a user to set a property
> via the Spark client. If that property is set, use the property value to
> report metrics; otherwise use the current implementation
> <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala>
> of reporting metrics on the appid.
>
> Thoughts?
>
> --
> Thanks
> Karan
>



-- 
Thanks
Karan


[Proposal] Enabling time series analysis on spark metrics

2016-02-23 Thread Karan Kumar
Hi,

Spark at the moment uses the application ID to report metrics. I was
wondering if we could add an option to export metrics under a
user-controlled key. This would allow us to do time series analysis on
counters by dumping them into a DB such as Graphite.

One of the approaches I had in mind was allowing a user to set a property
via the Spark client. If that property is set, use the property value to
report metrics; otherwise use the current implementation
<https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala>
of reporting metrics on the appid.

Thoughts?
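
A minimal sketch of the fallback described above. This is not Spark's actual code path; the config key name matches the one eventually adopted via SPARK-10610 (`spark.metrics.namespace`), but the object and method here are hypothetical:

```scala
object MetricsKey {
  // Prefer a user-set namespace; otherwise fall back to the application ID,
  // mirroring the proposed behavior. Empty values are treated as unset.
  def resolve(conf: Map[String, String], appId: String): String =
    conf.get("spark.metrics.namespace").filter(_.nonEmpty).getOrElse(appId)
}
```

With this, repeated runs of the same scheduled job report under one stable key, while jobs that set nothing keep today's per-appId behavior.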

-- 
Thanks
Karan