Sorry, I replied to Gerard’s question instead of yours.

See here:

Yes, you have to implement your own custom metrics Source using the Coda Hale 
Metrics library. See here for some examples: 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala
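
Roughly, a custom source looks like the sketch below. The class and metric 
names are made up for illustration. Note the Source trait is private[spark], 
which is why the class has to live under an org.apache.spark package:

    package org.apache.spark.metrics.source

    import java.util.concurrent.atomic.AtomicLong

    import com.codahale.metrics.{Gauge, MetricRegistry}

    // Illustrative example; class and metric names are made up.
    // Source is private[spark], hence the org.apache.spark package.
    class MyAppSource extends Source {
      override val sourceName: String = "myApp"
      override val metricRegistry: MetricRegistry = new MetricRegistry()

      // Counter your application code updates as it does work.
      val recordsProcessed = new AtomicLong(0L)

      // Gauge reporting the counter's current value when a sink polls it.
      metricRegistry.register(MetricRegistry.name("recordsProcessed"),
        new Gauge[Long] {
          override def getValue: Long = recordsProcessed.get()
        })
    }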

Once the source is registered, you then configure a sink for it, such as the 
JSON servlet you mentioned.
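
For reference, sources and sinks get wired up in conf/metrics.properties. 
Something like the following would register the source above by class name 
(assuming it has a no-arg constructor) and enable the console sink; the JSON 
servlet (MetricsServlet) is on by default and serves metrics under 
/metrics/json on the web UI:

    # Register the custom source on all instances (illustrative name).
    *.source.myApp.class=org.apache.spark.metrics.source.MyAppSource

    # Dump all metrics to stdout every 10 seconds.
    *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
    *.sink.console.period=10
    *.sink.console.unit=seconds

Alternatively you can register the source programmatically, e.g. via 
SparkEnv.get.metricsSystem.registerSource(new MyAppSource) — but again from 
code compiled into an org.apache.spark package, since MetricsSystem is 
private[spark] as well.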

I've done it in the past but unfortunately no longer have access to the 
source for that project.

Thanks,
Silvio

On 6/22/15, 9:57 AM, "dgoldenberg" <dgoldenberg...@gmail.com> wrote:

>Hi Gerard,
>
>Have there been any responses? Any insights as to what you ended up doing to
>enable custom metrics? I'm thinking of implementing a custom metrics sink,
>not sure how doable that is yet...
>
>Thanks.
>
>
>
>--
>View this message in context: 
>http://apache-spark-user-list.1001560.n3.nabble.com/Registering-custom-metrics-tp17765p23426.html
>Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>---------------------------------------------------------------------
>To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>For additional commands, e-mail: user-h...@spark.apache.org
>
