Hi Gerard,

Yes, you have to implement your own custom Metrics Source using the Coda Hale 
library. See here for some examples: 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

Once the source is registered, you then configure a sink for it, such as 
the JSON servlet you mentioned. A rough sketch of what that can look like is below.
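Something along these lines (a minimal, untested sketch from memory, not the exact code I used): note that the Source trait is private[spark] in the current releases, so the class has to live under an org.apache.spark package for it and the metrics system to be visible, and the names here are purely illustrative.

// Sketch of a job-specific metrics source. Placed under org.apache.spark.*
// because the Source trait and MetricsSystem are private[spark].
package org.apache.spark.metrics.source

import com.codahale.metrics.{Gauge, MetricRegistry}

// Hypothetical source; sourceName and metric names are illustrative only.
class MyJobSource extends Source {
  override val sourceName = "myJob"
  override val metricRegistry = new MetricRegistry()

  // A counter the job increments as it processes records.
  val recordsProcessed = metricRegistry.counter(MetricRegistry.name("recordsProcessed"))

  // A gauge that is read on demand whenever a sink polls the registry.
  metricRegistry.register(MetricRegistry.name("queueSize"), new Gauge[Int] {
    override def getValue: Int = 0 // replace with a real measurement
  })
}

object MyJobSource {
  // Convenience helper; call it once the SparkContext (and thus SparkEnv) is up.
  // Kept in this package so the private[spark] members resolve.
  def register(): MyJobSource = {
    val source = new MyJobSource
    org.apache.spark.SparkEnv.get.metricsSystem.registerSource(source)
    source
  }
}

After registration the metrics show up under the application/source namespace through whatever sinks you have enabled, e.g. the default metrics servlet's /metrics/json endpoint or a sink configured in conf/metrics.properties.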

I did this in the past but unfortunately no longer have access to the source for 
that project.

Thanks,
Silvio

From: Gerard Maas
Date: Thursday, October 30, 2014 at 4:53 PM
To: user, "d...@spark.apache.org"
Subject: Registering custom metrics

Hi,

I've been exploring the metrics exposed by Spark and I'm wondering whether 
there's a way to register job-specific metrics that could be exposed through 
the existing metrics system.

Would there be an example somewhere?

BTW, the documentation about how the metrics work could be improved. I found out 
about the default servlet and the metrics/json/ endpoint in the code, but I could 
not find any reference to them on the dedicated doc page [1]. It's probably 
something I could contribute if nobody is working on it at the moment.

-kr, Gerard.

[1] http://spark.apache.org/docs/1.1.0/monitoring.html#Metrics
