clownxc commented on code in PR #40707:
URL: https://github.com/apache/spark/pull/40707#discussion_r1162834923


##########
core/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -355,3 +355,24 @@ private[spark] class SparkSQLFeatureNotSupportedException(
 
   override def getErrorClass: String = errorClass
 }
+
+/**
+ * User error exception thrown from Spark with an error class.
+ */
+private[spark] class SparkUserException(

Review Comment:
   > Question: Are we sure a custom exception is needed for this case? Is there 
any existing exception we can reuse with the NPE as the cause?
   > 
   > If we want a brand-new exception, what about 
`SparkNotNullConstraintViolationException` to be more specific? I guess it will 
depend on whether we want to skip retries only for this exception type, as 
opposed to all Spark exceptions with known error codes.
   
   My understanding is that we want to skip the retry logic for any 
user-triggered error, not only the NPE case.
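
   To illustrate, here is a minimal sketch of the retry behavior I have in 
mind. Both the simplified `SparkUserException` stub and the 
`retryOnTransientError` helper are hypothetical, for illustration only; 
neither is the actual PR code nor an existing Spark API:

   ```scala
   // Sketch only: a simplified stand-in for the exception added in this PR.
   class SparkUserException(message: String) extends RuntimeException(message)

   object RetrySketch {
     // Runs `block`, retrying transient failures up to `maxAttempts` times,
     // but rethrows user-triggered errors immediately: retrying cannot fix
     // a mistake in the user's data or query.
     def retryOnTransientError[T](maxAttempts: Int)(block: => T): T = {
       try {
         block
       } catch {
         case e: SparkUserException => throw e            // user error: skip retries
         case e: Exception if maxAttempts <= 1 => throw e // attempts exhausted
         case _: Exception => retryOnTransientError(maxAttempts - 1)(block)
       }
     }

     def main(args: Array[String]): Unit = {
       var calls = 0
       // The first two calls fail transiently and are retried; a
       // SparkUserException thrown here would propagate on the first attempt.
       val result = retryOnTransientError(maxAttempts = 3) {
         calls += 1
         if (calls < 3) throw new RuntimeException("transient failure")
         "ok"
       }
       println(s"result=$result after $calls calls")
     }
   }
   ```

   Under this sketch, a more specific type such as 
`SparkNotNullConstraintViolationException` could later simply extend 
`SparkUserException`, and the retry logic would skip it as well.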



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

