I'm also pretty interested in how to create custom Sinks in Spark. I'm using it
with Ganglia, and the standard metrics from the JVM source do show up. I tried to
create my own metric based on Issac's code, but it does not show up in Ganglia.
Does anyone know where the problem is?
Here's the code snippet: 

import com.codahale.metrics.{Gauge, MetricRegistry}
import org.apache.spark.Accumulator
import org.apache.spark.metrics.source.Source

class AccumulatorSource(accumulator: Accumulator[Long], name: String) extends Source {

  val sourceName = "accumulator.metrics"
  val metricRegistry = new MetricRegistry()

  // Expose the accumulator's current value as a Codahale gauge.
  metricRegistry.register(MetricRegistry.name("accumulator", name), new Gauge[Long] {
    override def getValue: Long = accumulator.value
  })
}

and then in the main:
val longAccumulator = sc.accumulator[Long](0)
val accumulatorMetrics = new AccumulatorSource(longAccumulator, "counters.accumulator")
SparkEnv.get.metricsSystem.registerSource(accumulatorMetrics)
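
For what it's worth, here is a small sketch I could use to sanity-check the gauge
locally, independent of Spark's metrics system and Ganglia. It just attaches a plain
Codahale ConsoleReporter to the registry of the AccumulatorSource defined above
(the 10-second interval is an arbitrary choice):

import java.util.concurrent.TimeUnit
import com.codahale.metrics.ConsoleReporter

// Print all metrics in the source's registry to stdout every 10 seconds.
// If the "accumulator.counters.accumulator" gauge shows up here with the
// expected value, the Source itself is probably fine and the issue would
// be in the sink configuration or in when/where the source is registered.
val reporter = ConsoleReporter.forRegistry(accumulatorMetrics.metricRegistry)
  .convertRatesTo(TimeUnit.SECONDS)
  .convertDurationsTo(TimeUnit.MILLISECONDS)
  .build()
reporter.start(10, TimeUnit.SECONDS)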



