Github user jose-torres commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21721#discussion_r200537454
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/streaming/progress.scala ---
    @@ -178,12 +180,18 @@ class SourceProgress protected[sql](
           if (value.isNaN || value.isInfinity) JNothing else JDouble(value)
         }
     
    -    ("description" -> JString(description)) ~
    +    val jsonVal = ("description" -> JString(description)) ~
           ("startOffset" -> tryParse(startOffset)) ~
           ("endOffset" -> tryParse(endOffset)) ~
           ("numInputRows" -> JInt(numInputRows)) ~
           ("inputRowsPerSecond" -> safeDoubleToJValue(inputRowsPerSecond)) ~
           ("processedRowsPerSecond" -> 
safeDoubleToJValue(processedRowsPerSecond))
    +
    +    if (customMetrics != null) {
    +      jsonVal ~ ("customMetrics" -> tryParse(customMetrics.json()))
    --- End diff --
    
    Is there any way to get an error to the user if their custom metrics fail 
to parse? I'm not entirely sure that's the right thing to do, but I worry that 
it'll be hard to develop against this API if we just silently drop malformed 
metrics.
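One way to address this concern, sketched here purely as an assumption (this is not what the PR or Spark actually does), is a variant of `tryParse` that emits a warning when the custom-metrics JSON is malformed instead of dropping it silently. The object name `CustomMetricsJson` and method `tryParseOrWarn` are hypothetical; it uses json4s, the same library `progress.scala` already depends on:

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

import scala.util.control.NonFatal

// Hypothetical helper, not the actual Spark implementation.
object CustomMetricsJson {
  // Parse the metrics string; on failure, warn and fall back to the raw
  // string so the malformed payload is still visible to the user.
  def tryParseOrWarn(s: String): JValue =
    try parse(s)
    catch {
      case NonFatal(e) =>
        // A real implementation would likely use Spark's logWarning.
        Console.err.println(
          s"Custom metrics could not be parsed as JSON: ${e.getMessage}")
        JString(s)
    }
}
```

Falling back to `JString(s)` rather than `JNothing` keeps the reported progress lossless, while the warning gives developers the feedback this comment asks about.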


---
