[ https://issues.apache.org/jira/browse/BEAM-4775?focusedWorklogId=201658&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-201658 ]

ASF GitHub Bot logged work on BEAM-4775:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 20/Feb/19 22:32
            Start Date: 20/Feb/19 22:32
    Worklog Time Spent: 10m 
      Work Description: ajamato commented on pull request #7868: [BEAM-4775] 
MonitoringInfo URN tweaks
URL: https://github.com/apache/beam/pull/7868#discussion_r258706115
 
 

 ##########
 File path: model/fn-execution/src/main/proto/beam_fn_api.proto
 ##########
 @@ -339,14 +339,16 @@ message MonitoringInfoSpecs {
   enum Enum {
     // TODO(ajamato): Add the PTRANSFORM name as a required label after
     // upgrading the python SDK.
-    USER_COUNTER = 0 [(monitoring_info_spec) = {
+    USER_METRIC = 0 [(monitoring_info_spec) = {
 
 Review comment:
   @Ardagan and I have discussed that it's not really the best idea to make the 
URN a prefix for the user metric. This has led to writing a lot of code to 
parse the namespace and name back out of the URN, and because this is a weird 
exception, all of that code has to special-case it.
   
   It would be better to package the namespace and name as labels on the 
MonitoringInfo.
   Then we could use the same URN everywhere: no parsing, no special casing.
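   A rough sketch of the difference (function and label names here are illustrative, not the actual Beam API; the prefixed-URN format is also just an example):

```python
# Hypothetical sketch: contrast parsing the namespace/name out of a
# prefixed URN with reading them from labels on the MonitoringInfo.

# Prefix approach: the URN itself carries the user-defined fields, so
# every consumer has to split the string apart and special-case it.
def parse_prefixed_urn(urn):
    # e.g. "beam:metric:user:v1:myNamespace:myName" (illustrative format)
    prefix = "beam:metric:user:v1:"
    if not urn.startswith(prefix):
        raise ValueError("not a user metric URN: %s" % urn)
    namespace, name = urn[len(prefix):].split(":", 1)
    return namespace, name

# Label approach: the URN is constant and the fields live in labels,
# so there is no parsing and no special casing.
def read_labels(monitoring_info):
    labels = monitoring_info["labels"]
    return labels["NAMESPACE"], labels["NAME"]

info = {
    "urn": "beam:metric:user:v1",
    "labels": {"PTRANSFORM": "pt0", "NAMESPACE": "myNamespace", "NAME": "myName"},
}
assert parse_prefixed_urn("beam:metric:user:v1:myNamespace:myName") == ("myNamespace", "myName")
assert read_labels(info) == ("myNamespace", "myName")
```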
   
   Given this, there are two choices:
   (1) Use a different URN for user gauges, counters, and distributions:
   - "beam:metric:user:counter:v1"
   - "beam:metric:user:gauge:v1"
   - "beam:metric:user:distribution:v1"
   Then change the MonitoringInfoSpec to:
         required_labels: [ "PTRANSFORM", "NAMESPACE", "NAME" ],
   and give each one a separate type_urn in its spec.
   
   
   (2) Use a single URN "beam:metric:user:v1" for user gauges, counters, and 
distributions.
   Then change the MonitoringInfoSpec in the same way:
         required_labels: [ "PTRANSFORM", "NAMESPACE", "NAME" ],
   but do NOT enforce a type_urn in the spec.
   
   
   I prefer (1), as it describes each metric more precisely, and if we add new 
user metric styles we can add a spec for each of them.
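   Option (1) could look roughly like the following sketch, with one spec per user metric style, each carrying its own URN and type_urn but sharing the same required labels (the spec layout and type URN strings here are illustrative, not the actual proto definitions):

```python
# Hypothetical sketch of option (1): a registry keyed by per-style URN.
# Validation needs no URN parsing; it only checks the required labels.
USER_METRIC_SPECS = {
    "beam:metric:user:counter:v1": {
        "required_labels": ["PTRANSFORM", "NAMESPACE", "NAME"],
        "type_urn": "beam:metrics:sum_int_64",        # illustrative
    },
    "beam:metric:user:gauge:v1": {
        "required_labels": ["PTRANSFORM", "NAMESPACE", "NAME"],
        "type_urn": "beam:metrics:latest_int_64",     # illustrative
    },
    "beam:metric:user:distribution:v1": {
        "required_labels": ["PTRANSFORM", "NAMESPACE", "NAME"],
        "type_urn": "beam:metrics:distribution_int_64",  # illustrative
    },
}

def validate(info):
    # Look up the spec by the (constant) URN and check the labels.
    spec = USER_METRIC_SPECS[info["urn"]]
    missing = [l for l in spec["required_labels"] if l not in info["labels"]]
    if missing:
        raise ValueError("missing labels: %s" % missing)
    return spec["type_urn"]
```

   Adding a new user metric style is then just one more registry entry, which is the extensibility argument for preferring (1).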
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 201658)
    Time Spent: 18h 10m  (was: 18h)

> JobService should support returning metrics
> -------------------------------------------
>
>                 Key: BEAM-4775
>                 URL: https://issues.apache.org/jira/browse/BEAM-4775
>             Project: Beam
>          Issue Type: Bug
>          Components: beam-model
>            Reporter: Eugene Kirpichov
>            Assignee: Ryan Williams
>            Priority: Major
>              Labels: triaged
>          Time Spent: 18h 10m
>  Remaining Estimate: 0h
>
> [https://github.com/apache/beam/blob/master/model/job-management/src/main/proto/beam_job_api.proto]
>  currently doesn't appear to have a way for JobService to return metrics to a 
> user, even though 
> [https://github.com/apache/beam/blob/master/model/fn-execution/src/main/proto/beam_fn_api.proto]
>  includes support for reporting SDK metrics to the runner harness.
>  
> Metrics are apparently necessary to run any ValidatesRunner tests because 
> PAssert needs to validate that the assertions succeeded. However, this 
> statement should be double-checked: perhaps it's possible to somehow work 
> with PAssert without metrics support.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
