Is the suggestion just to use a different config (and maybe fallback to
appid) in order to publish metrics? Seems reasonable.
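The fallback being discussed (use a user-supplied config value when present, otherwise the appid) could be sketched roughly like this. Note the key name `spark.metrics.namespace` and the `resolve` helper are illustrative assumptions, not something decided in this thread:

```scala
object MetricsNamespace {
  // Hypothetical config key; the actual name would be up to the community.
  val NamespaceKey = "spark.metrics.namespace"

  // Use the user-supplied namespace if set; otherwise fall back to the
  // application ID, matching the current reporting behavior.
  def resolve(conf: Map[String, String], appId: String): String =
    conf.getOrElse(NamespaceKey, appId)
}
```

With a stable, user-chosen namespace, successive runs of the same job report under one key, which is what makes time-series queries in a backend like Graphite possible.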


On Tue, Mar 1, 2016 at 8:17 AM, Karan Kumar <karankumar1...@gmail.com>
wrote:

> +dev mailing list
>
> Time-series analysis on metrics becomes quite useful when running Spark
> jobs under a workflow manager like Oozie.
>
> Would love to take this up if the community thinks it's worthwhile.
>
> On Tue, Feb 23, 2016 at 2:59 PM, Karan Kumar <karankumar1...@gmail.com>
> wrote:
>
>> Hi
>>
>> Spark at the moment uses the application ID to report metrics. I was
>> thinking we could add an option to export metrics under a user-controlled
>> key. This would allow us to do time-series analysis on counters by dumping
>> them into a DB such as Graphite.
>>
>> One of the approaches I had in mind was to allow a user to set a property
>> via the Spark client. If that property is set, use its value to report
>> metrics; otherwise, fall back to the current implementation
>> <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala> of
>> reporting metrics under the appid.
>>
>> Thoughts?
>>
>> --
>> Thanks
>> Karan
>>
>
>
>
> --
> Thanks
> Karan
>
