[ 
https://issues.apache.org/jira/browse/SPARK-38117?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zoli updated SPARK-38117:
-------------------------
    Description: 
Setting up the Prometheus sink this way:
{code:java}
-c spark.ui.prometheus.enabled=true
-c spark.executor.processTreeMetrics.enabled=true
-c spark.metrics.conf=/spark/conf/metric.properties{code}
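For context, these flags are passed to spark-submit. A rough sketch of the full invocation; the master URL, class, and application jar below are placeholders, not my actual job:
{code:java}
# Hypothetical invocation; master URL, --class and the jar are placeholders.
spark-submit \
  --master k8s://https://<k8s-apiserver>:443 \
  --deploy-mode cluster \
  --class org.example.MyApp \
  -c spark.ui.prometheus.enabled=true \
  -c spark.executor.processTreeMetrics.enabled=true \
  -c spark.metrics.conf=/spark/conf/metric.properties \
  local:///opt/spark/app/my-app.jar{code}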
*metric.properties:*
{code:java}
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus{code}

Result:

Both of these endpoints expose some metrics:
{code:java}
<driver-ip>:4040/metrics/prometheus 
<driver-ip>:4040/metrics/executors/prometheus{code}
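I query them like this (the driver IP stands in for the driver pod address in my setup):
{code:java}
curl http://<driver-ip>:4040/metrics/prometheus
curl http://<driver-ip>:4040/metrics/executors/prometheus{code}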

But the executor endpoint misses the metrics under the executor namespace described here:
[https://spark.apache.org/docs/3.1.2/monitoring.html#component-instance--executor]
Everything from {{bytesRead.count}} to {{threadpool.startedTasks}} is missing.

There are no error- or warn-level entries in the driver or executor logs.

After switching to ConsoleSink, I can see all the expected metrics:
{code:java}
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds{code}
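For completeness, the same sink could presumably be scoped to just the executor instance using the instance prefixes from Spark's metrics.properties.template (a sketch, not something I ran verbatim):
{code:java}
# Hypothetical instance-scoped variant of the console sink above
executor.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
executor.sink.console.period=10
executor.sink.console.unit=seconds{code}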

Something is wrong with the Spark-Prometheus integration.

  was:
Setting up prometheus sink in this way:
-c spark.ui.prometheus.enabled=true
-c spark.executor.processTreeMetrics.enabled=true
-c spark.metrics.conf=/spark/conf/metric.properties

*metric.properties:*
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus
Result:

Both of these endpoints have some metrics
<driver-ip>:4040/metrics/prometheus        
<driver-ip>:4040/metrics/executors/prometheus
{{But the executor one}} misses metrics under the executor namespace described 
here: 
[https://spark.apache.org/docs/3.1.2/monitoring.html#component-instance--executor]
So everything is missing from {{bytesRead.count}} to {{threadpool.startedTasks}}

There are neither error nor warn level entries in the driver/executor logs.

By changing to ConsoleSink  I can see all the necessary metrics:
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds
{{Something is wrong with the spark-prometheus integration}}


> Executor metrics are missing on prometheus sink
> -----------------------------------------------
>
>                 Key: SPARK-38117
>                 URL: https://issues.apache.org/jira/browse/SPARK-38117
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.2
>         Environment: versions: Spark3.1.2, K8s v19
>            Reporter: zoli
>            Priority: Major
>
> Setting up prometheus sink in this way:
> {code:java}
> -c spark.ui.prometheus.enabled=true
> -c spark.executor.processTreeMetrics.enabled=true
> -c spark.metrics.conf=/spark/conf/metric.properties{code}
> *metric.properties:*
> {code:java}
> *.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
> *.sink.prometheusServlet.path=/metrics/prometheus{code}
> Result:
> Both of these endpoints have some metrics
> {code:java}
> <driver-ip>:4040/metrics/prometheus 
> <driver-ip>:4040/metrics/executors/prometheus{code}
> {{But the executor one}} misses metrics under the executor namespace 
> described here: 
> [https://spark.apache.org/docs/3.1.2/monitoring.html#component-instance--executor]
> So everything is missing from {{bytesRead.count}} to 
> {{threadpool.startedTasks}}
> There are neither error nor warn level entries in the driver/executor logs.
> By changing to ConsoleSink  I can see all the necessary metrics:
> {code:java}
> *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
> *.sink.console.period=10
> *.sink.console.unit=seconds{code}
> {{Something is wrong with the spark-prometheus integration}}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
