xkrogen commented on pull request #33116:
URL: https://github.com/apache/spark/pull/33116#issuecomment-880022892


   > I think we can force the `TimerWithMillisecondSnapshots` to receive a time-unit parameter to indicate which time-unit it exposes.
   
   I don't think we can force callers to supply a time unit, since (a) we should adhere to the `Timer` interface, which already exposes the unitless methods, and (b) the whole issue here is that these metrics themselves are trying to define their unit; if the caller has to supply a unit, then the caller must somehow already know that this metric wants to be in milliseconds, which defeats the purpose.
   
   If we want to go down a route like this, it's basically what I was proposing initially, where `YarnShuffleServiceMetrics` does the conversion. The `Timer` interface doesn't provide any way to extract unit information, but we could either check whether the metric name ends in "millis" (and if so, treat the unit as millis instead of nanos; sketched further below) or create a custom extension with a method defining the units, like:
   ```
     // Timer subclass that tags itself with the unit its recorded values represent
     static class TimerWithUnits extends Timer {
       private final TimeUnit timeUnit;

       TimerWithUnits(TimeUnit timeUnit) {
         this.timeUnit = timeUnit;
       }

       public TimeUnit getTimeUnit() {
         return timeUnit;
       }
     }
   ```
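   (A producer would then tag the timer at construction time, e.g. for a metric like `openBlockRequestLatencyMillis` -- registry usage shown here is a sketch only; in the shuffle service the metrics actually come from a `MetricSet`, but the idea is the same:)
   ```
     MetricRegistry registry = new MetricRegistry();
     Timer openBlockTime = registry.register(
         "openBlockRequestLatencyMillis", new TimerWithUnits(TimeUnit.MILLISECONDS));
   ```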
   Then within `YarnShuffleServiceMetrics#collectMetric`:
   ```
         TimeUnit timeUnit;
         if (metric instanceof TimerWithUnits) {
           timeUnit = ((TimerWithUnits) metric).getTimeUnit();
         } else {
           timeUnit = TimeUnit.NANOSECONDS;
         }
         // do some conversion on the snapshot based on the unit above
   ```
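   The name-suffix fallback could be an extra branch in the same `if`, and the conversion itself is just a rescale of the snapshot values from the detected unit. Rough sketch (the `name` parameter and scaling of only the mean are illustrative):
   ```
         TimeUnit timeUnit;
         if (metric instanceof TimerWithUnits) {
           timeUnit = ((TimerWithUnits) metric).getTimeUnit();
         } else if (name.endsWith("Millis")) {
           // fall back to the naming convention for untagged timers
           timeUnit = TimeUnit.MILLISECONDS;
         } else {
           timeUnit = TimeUnit.NANOSECONDS;
         }
         // rescale, e.g. the snapshot mean, into milliseconds for reporting
         Snapshot snapshot = ((Timer) metric).getSnapshot();
         double meanMillis = snapshot.getMean() * timeUnit.toNanos(1) / 1_000_000.0;
   ```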

