Hello,
I am trying to publish custom metrics using the Spark CustomMetric API,
supported since Spark 3.2 (https://github.com/apache/spark/pull/31476,
https://spark.apache.org/docs/3.2.0/api/java/org/apache/spark/sql/connector/metric/CustomMetric.html).
I have created a custom metric implementing `CustomMetric` with a default
constructor, overriding `name` and `description`.
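For reference, the metric class looks roughly like this (a simplified sketch; `Spark32CustomMetric` matches the class name in my logs, but the metric name and description strings are placeholders):

```scala
import org.apache.spark.sql.connector.metric.CustomMetric

// Needs a public no-arg constructor: SQLAppStatusListener instantiates
// this class reflectively on the driver when aggregating task metrics.
class Spark32CustomMetric extends CustomMetric {
  override def name(): String = "myCustomMetric"          // placeholder name
  override def description(): String = "my custom metric" // placeholder description

  // Sums the per-task values reported by the matching CustomTaskMetric.
  override def aggregateTaskMetrics(taskMetrics: Array[Long]): String =
    taskMetrics.sum.toString
}
```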
I return a new instance of this custom metric from the
`supportedCustomMetrics` method of `org.apache.spark.sql.connector.read.Scan`.
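The wiring in the scan looks like this (a fragment; `MyScan` is a stand-in name and the rest of the `Scan` implementation is elided):

```scala
import org.apache.spark.sql.connector.metric.CustomMetric
import org.apache.spark.sql.connector.read.Scan

class MyScan extends Scan {
  // ...readSchema(), toBatch(), etc. elided...

  // Advertise the driver-side metric so Spark knows how to aggregate
  // and display the per-task values in the SQL tab of the UI.
  override def supportedCustomMetrics(): Array[CustomMetric] =
    Array(new Spark32CustomMetric)
}
```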
I have also created a custom task metric implementing `CustomTaskMetric`,
with the same name as the `CustomMetric` class, and return it from
`currentMetricsValues` of the `PartitionReader`.
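The executor side is sketched below (again simplified; the class name is a stand-in and the value is hard-coded while I debug). Note the two sides are matched purely by the `name()` string:

```scala
import org.apache.spark.sql.connector.metric.CustomTaskMetric

class Spark32CustomTaskMetric extends CustomTaskMetric {
  // Must return exactly the same string as Spark32CustomMetric.name();
  // the driver matches task values to metrics by this name.
  override def name(): String = "myCustomMetric"
  override def value(): Long = 1234L // static value for now
}
```

And in the `PartitionReader`:

```scala
override def currentMetricsValues(): Array[CustomTaskMetric] =
  Array(new Spark32CustomTaskMetric)
```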
The task metric returns a static value for now, but when I run the
application, the Spark History Server page shows the metric's value as N/A.
I added logs in `aggregateTaskMetrics` and confirmed the flow reaches it:
`SQLAppStatusListener.aggregateMetrics` loads my class and calls
`aggregateTaskMetrics`, yet the Spark UI still shows N/A. I do, however,
see the metrics in the Spark event logs.
Driver log:
```
23/06/23 19:23:53 INFO Spark32CustomMetric: Spark32CustomMetric in
aggregateTaskMetrics start
23/06/23 19:23:53 INFO Spark32CustomMetric: Spark32CustomMetric in
aggregateTaskMetrics sum:1234 end
+------+----------+-------+-----------+
|  word|word_count| corpus|corpus_date|
+------+----------+-------+-----------+
|  LVII|         1|sonnets|          0|
|augurs|         1|sonnets|          0|
|dimm'd|         1|sonnets|          0|
```
Attaching the Spark UI page screenshot.
Am I missing something? Any help is really appreciated.
Thanks.