[ https://issues.apache.org/jira/browse/SPARK-10610?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15179507#comment-15179507 ]

Pete Robbins commented on SPARK-10610:
--------------------------------------

I think the appId is an important piece of information when visualizing the 
metrics, along with hostname, executorId, etc. I'm writing a sink and reporter 
to push the metrics to Elasticsearch, and I include these fields in the metric 
documents for better correlation, e.g.:

{
    "timestamp": "2016-03-03T15:58:31.903+0000",
    "hostName": "9.20.187.127",
    "applicationId": "app-20160303155742-0005",
    "executorId": "driver",
    "BlockManager_memory_maxMem_MB": 3933
}

I extract the appId and executorId from the metric name. When the sink is 
instantiated I don't believe I have access to any Utils to obtain the current 
appId and executorId, so for the moment I'm relying on their presence in the 
metric name.
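
A minimal sketch of that extraction, assuming the default 
"<appId>.<executorId>.<rest>" naming layout (e.g. 
app-20160303155742-0005.driver.BlockManager.memory.maxMem_MB):

    // Split a Codahale metric name into (appId, executorId, remainder),
    // assuming the default Spark layout "<appId>.<executorId>.<rest>".
    def parseMetricName(name: String): Option[(String, String, String)] =
      name.split("\\.", 3) match {
        case Array(appId, executorId, rest) => Some((appId, executorId, rest))
        case _ => None // unexpected shape; leave the name unparsed
      }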

Is it possible to make the appId, applicationName, and executorId available to 
me via some Utils function that I can access in a metrics Sink?
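
For context, here is a minimal sketch of the constructor shape that 
MetricsSystem invokes by reflection when instantiating a sink; only these 
three arguments are supplied, and none of them identifies the application or 
executor. The class name and the Elasticsearch reporter details are 
illustrative only:

    package org.apache.spark.metrics.sink

    import java.util.Properties
    import com.codahale.metrics.MetricRegistry
    import org.apache.spark.SecurityManager

    // The Sink trait is private[spark], so a custom sink has to live in an
    // org.apache.spark.* package to be loadable.
    class ElasticsearchSink(
        val property: Properties,      // this sink's metrics.properties settings
        val registry: MetricRegistry,  // registry whose metric names embed appId/executorId
        securityMgr: SecurityManager)
      extends Sink {

      override def start(): Unit = { /* start the (hypothetical) Elasticsearch reporter */ }
      override def stop(): Unit = { /* stop the reporter */ }
      override def report(): Unit = { /* flush once, e.g. on shutdown */ }
    }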

> Using AppName instead of AppId in the name of all metrics
> ---------------------------------------------------------
>
>                 Key: SPARK-10610
>                 URL: https://issues.apache.org/jira/browse/SPARK-10610
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.5.0
>            Reporter: Yi Tian
>            Priority: Minor
>
> When we use {{JMX}} to monitor a Spark application, we have to configure the 
> names of the target metrics in the monitoring system. But the current metric 
> name is {{appId}} + {{executorId}} + {{source}}, so whenever the Spark 
> program restarts we have to update the metric names in the monitoring system.
> We should add an optional configuration property that controls whether the 
> appName is used instead of the appId in the Spark metrics system.
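> For example (the second appId below is hypothetical), a restart renames 
> every metric, whereas an appName-based name would stay stable across runs:
>
>     app-20160303155742-0005.driver.BlockManager.memory.maxMem_MB   (first run)
>     app-20160303160030-0006.driver.BlockManager.memory.maxMem_MB   (after restart)
>     MyStreamingApp.driver.BlockManager.memory.maxMem_MB            (with appName)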
