[ 
https://issues.apache.org/jira/browse/SPARK-22343?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-22343:
------------------------------------

    Assignee: Apache Spark

> Add support for publishing Spark metrics into Prometheus
> --------------------------------------------------------
>
>                 Key: SPARK-22343
>                 URL: https://issues.apache.org/jira/browse/SPARK-22343
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Janos Matyas
>            Assignee: Apache Spark
>
> I've created a PR (https://github.com/apache-spark-on-k8s/spark/pull/531) to 
> support publishing Spark metrics to Prometheus in the 
> https://github.com/apache-spark-on-k8s/spark fork (Spark on Kubernetes). 
> According to the maintainers of the project, I should create a ticket here as 
> well so the work can be tracked upstream. The original text of the PR follows: 
> _
> Publishing Spark metrics into Prometheus - as discussed earlier in 
> https://github.com/apache-spark-on-k8s/spark/pull/384. 
> Implemented a metrics sink that publishes Spark metrics into Prometheus via the 
> [Prometheus Pushgateway](https://prometheus.io/docs/instrumenting/pushing/). 
> The metrics data published by Spark is based on 
> [Dropwizard](http://metrics.dropwizard.io/). The format of Spark metrics is 
> not natively supported by Prometheus, so they are converted using 
> [DropwizardExports](https://prometheus.io/client_java/io/prometheus/client/dropwizard/DropwizardExports.html)
>  prior to pushing them to the Pushgateway.
> Also, the default Prometheus Pushgateway client API implementation does not 
> support metric timestamps, so the client API has been enhanced to enrich the 
> metrics data with timestamps. _
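To illustrate the conversion the PR describes, here is a minimal, hypothetical sketch (not the PR's actual Scala code, which uses DropwizardExports and an enhanced Pushgateway client): it renders Dropwizard-style metric readings in the Prometheus text exposition format and appends a millisecond timestamp to each sample, which is the enrichment step the PR adds. The metric names and values are invented for the example.

```python
# Hypothetical sketch of the PR's conversion idea: Dropwizard-style
# readings -> Prometheus text exposition format, with per-sample
# millisecond timestamps appended (the enrichment the PR describes).
import re
import time


def sanitize(name):
    # Prometheus metric names must match [a-zA-Z_:][a-zA-Z0-9_:]*,
    # so Dropwizard's dotted names (e.g. "master.workers") are rewritten.
    return re.sub(r"[^a-zA-Z0-9_:]", "_", name)


def to_exposition(metrics, timestamp_ms=None):
    """Render {name: (type, value)} as Prometheus exposition text."""
    ts = timestamp_ms if timestamp_ms is not None else int(time.time() * 1000)
    lines = []
    for name, (mtype, value) in sorted(metrics.items()):
        pname = sanitize(name)
        lines.append(f"# TYPE {pname} {mtype}")
        lines.append(f"{pname} {value} {ts}")
    return "\n".join(lines) + "\n"


# Invented sample readings, standing in for what a Spark Dropwizard
# registry might report.
sample = {
    "master.workers": ("gauge", 2),
    "worker.coresUsed": ("gauge", 8),
}
print(to_exposition(sample, timestamp_ms=1508400000000))
```

The rendered text would then be pushed to the Pushgateway's `/metrics/job/<job>` endpoint over HTTP; note that the stock Pushgateway client omits timestamps, which is exactly why the PR enhances it.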



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
