I'm looking at the doc here: https://spark.apache.org/docs/latest/monitoring.html.
Is there a way to define custom metrics in Spark, perhaps via Coda Hale (Dropwizard) Metrics, and emit them? Can a custom metrics sink be defined? And can such a sink collect some metrics, execute some metrics-handling logic, and then invoke a callback to notify the Spark consumers that emitted the metrics that the logic has been executed? Thanks.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Custom-Spark-metrics-tp23350.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
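For context, a custom metric source in the Coda Hale style might look like the sketch below. The class name `MyAppSource`, the source name `"myapp"`, and the counter name are all hypothetical illustrations, not part of any Spark API. Note that Spark's `Source` and `Sink` traits have been `private[spark]` in some Spark versions, so a user-defined class may need to live under an `org.apache.spark` package to compile against them.

```scala
// Sketch only: assumes Spark and Dropwizard Metrics on the classpath.
// Placed in this package because the Source trait has been private[spark]
// in some Spark versions.
package org.apache.spark.metrics.source

import com.codahale.metrics.{Counter, MetricRegistry}

// A custom source exposing a single counter to Spark's metrics system.
class MyAppSource extends Source {
  override val sourceName: String = "myapp"
  override val metricRegistry: MetricRegistry = new MetricRegistry()

  // Increment this from job code wherever the event of interest occurs.
  val recordsProcessed: Counter =
    metricRegistry.counter(MetricRegistry.name("recordsProcessed"))
}
```

A source or sink is then wired up in `conf/metrics.properties` by giving its fully qualified class name, e.g. a line of the form `*.source.myapp.class=org.apache.spark.metrics.source.MyAppSource` (the class needs a no-argument constructor so Spark can instantiate it by reflection). One caveat on the callback question: as far as I know, Dropwizard-style sinks are poll-based reporters that periodically read the registry, so notifying the original metric producers would require custom plumbing outside the metrics system itself.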