LucaCanali edited a comment on pull request #31367:
URL: https://github.com/apache/spark/pull/31367#issuecomment-769738766


   I have updated the PR with the proposed list of metrics; it currently 
contains 10 metrics.
   I can see the argument for exposing just a few important metrics.
   However, the Python UDF implementation is complex, with many moving parts, 
and metrics also help when troubleshooting corner cases; in those situations, 
more metrics mean more flexibility and a better chance of finding the root 
cause of a problem.
   
   Another point for discussion is how accurate the metrics are in the current 
implementation. I have run a few tests to check that the measured values make 
sense and are in the ballpark of what is expected. There is room for more 
tests on corner cases to see how the instrumentation holds up there.
   
   In particular, measuring execution time can be challenging, since here we 
attempt to do it from the JVM side. I am aiming for timing metrics that are 
“good enough to be useful” rather than precise. I have put hints about the 
nuances I found while testing into the metrics descriptions.
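To illustrate why timing from the calling side is inherently approximate, here is a minimal stdlib-only Python sketch (not Spark code; `run_udf_batch` and the lambda are hypothetical stand-ins for the Python worker round trip). The measured duration covers the whole send/execute/receive cycle, so it includes serialization and wait time, not just pure UDF execution:

```python
import time

def run_udf_batch(rows, udf):
    """Time a simulated worker call the way a caller (e.g. the JVM side)
    might: around the whole round trip. The elapsed time therefore
    includes overhead beyond pure UDF execution, which is why such
    timings are "good enough to be useful" rather than precise."""
    start = time.monotonic()
    results = [udf(r) for r in rows]  # stands in for send + execute + receive
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = run_udf_batch(range(1000), lambda x: x * 2)
```

`time.monotonic()` is used rather than wall-clock time because it is unaffected by system clock adjustments, which matters for interval measurements.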
   
   I think the "time spent sending data" metric can be useful when 
troubleshooting cases where the performance problem lies in sending large 
amounts of data to Python. Time spent executing is the key metric for 
understanding overall performance. The numbers of rows returned and processed 
are also useful metrics, for understanding how much work has been done and how 
much remains when monitoring the progress of an active query.
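The row-count metrics above can be pictured with a small stdlib-only sketch (again illustrative, not the PR's implementation; `process_batches` and the metric names are hypothetical): counters are incremented around each batch exchanged with the worker, so progress can be observed while a query is still running.

```python
def process_batches(batches, udf):
    """Accumulate row-count metrics around a batched exchange,
    mimicking 'rows sent to Python' and 'rows returned from Python'."""
    metrics = {"rows_sent": 0, "rows_returned": 0}
    out = []
    for batch in batches:
        metrics["rows_sent"] += len(batch)       # rows shipped to the worker
        result = [udf(r) for r in batch]          # simulated UDF execution
        metrics["rows_returned"] += len(result)  # rows received back
        out.extend(result)
    return out, metrics

out, metrics = process_batches([[1, 2, 3], [4, 5]], lambda x: x + 1)
```

Because the counters advance per batch rather than per query, a monitoring tool can read them mid-execution to estimate remaining work.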

