HyukjinKwon commented on pull request #28661:
URL: https://github.com/apache/spark/pull/28661#issuecomment-635404514


   I didn't care much about this at first, but I've realised that people 
really dislike seeing the JVM stacktrace in Python exceptions. Maybe it's 
because you (and I, and most people in Spark dev) are used to the Java side.
   
   It reminds me of Holden's talk, ["Debugging PySpark—Or Why is There a JVM 
Stack Trace in My 
Python?"](https://databricks.com/session/debugging-pyspark-or-why-is-there-a-jvm-stack-trace-in-my-python),
 which is one example of this pain point.
   
   I also think I should have added some more context in the PR description. 
This PR:
     - Handles the whitelisted exception `AnalysisException`, which usually 
carries a reasonable error message on its own, so the JVM stacktrace is hidden.
     - Adds exceptions raised from Python UDFs to the whitelist and handles 
them the same way.
   
   For other, arbitrary exceptions, say a runtime exception from a shuffle, 
or user-defined exceptions, there is no behaviour change.
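
   To illustrate the whitelisting idea, here is a minimal sketch in plain 
Python. The names (`convert_exception`, `WHITELIST`, `CapturedException`) are 
hypothetical and do not match PySpark's actual implementation in 
`pyspark.sql.utils`; the point is only that whitelisted JVM exception classes 
are re-raised with just their message, while everything else keeps the full 
JVM stacktrace:

   ```python
   # Hypothetical sketch, NOT PySpark's real code: whitelisted JVM exception
   # classes become short Python exceptions; others keep the JVM stacktrace.

   class AnalysisException(Exception):
       """Stand-in for pyspark.sql.utils.AnalysisException."""

   class CapturedException(Exception):
       """Stand-in for an exception that still carries the JVM stacktrace."""
       def __init__(self, desc, stacktrace):
           self.stacktrace = stacktrace
           super().__init__(desc + "\n" + stacktrace)

   # Hypothetical whitelist: only these classes get the short form.
   WHITELIST = {"org.apache.spark.sql.AnalysisException": AnalysisException}

   def convert_exception(java_class, desc, stacktrace):
       """Return a message-only exception for whitelisted classes,
       otherwise keep the full JVM stacktrace in the message."""
       if java_class in WHITELIST:
           return WHITELIST[java_class](desc)  # JVM stacktrace dropped
       return CapturedException(desc, stacktrace)

   # A whitelisted AnalysisException: message only, no JVM frames.
   e = convert_exception(
       "org.apache.spark.sql.AnalysisException",
       "cannot resolve '`bad_col`'",
       "at org.apache.spark.sql...",
   )

   # An arbitrary runtime exception: behaviour unchanged, stacktrace kept.
   other = convert_exception(
       "java.lang.RuntimeException",
       "shuffle failed",
       "at org.apache.spark.shuffle...",
   )
   ```

   So a whitelisted `AnalysisException` surfaces only its message, while the 
runtime exception still shows the JVM frames, matching the "no behaviour 
change" note above.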


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


