heyihong commented on code in PR #42377:
URL: https://github.com/apache/spark/pull/42377#discussion_r1329300029


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/utils/ErrorUtils.scala:
##########
@@ -57,28 +69,105 @@ private[connect] object ErrorUtils extends Logging {
     classes.toSeq
   }
 
-  private def buildStatusFromThrowable(st: Throwable, stackTraceEnabled: Boolean): RPCStatus = {
+  private def serializeClasses(t: Throwable): String = {
+    JsonMethods.compact(JsonMethods.render(allClasses(t.getClass).map(_.getName)))
+  }
+
+  private[connect] val NUM_ERRORS_LIMIT = 5
+
+  // We can get full exception messages and optionally stacktrace by
+  // a separate RPC call if enrichErrorEnabled is true. So imposing a smaller
+  // limit to reduce the probability of hitting the 8KB header limit.
+  private val MAX_MESSAGE_SIZE = 512
+
+  // Convert Throwable to a protobuf message FetchErrorDetailsResponse.
+  // Truncate error messages by default.
+  private[connect] def throwableToFetchErrorDetailsResponse(

Review Comment:
   I will remove it then...
   
   The issue here is that, unlike Python and other non-JVM clients, which can
   directly fall back to the message and classes in ErrorInfo if the RPC fails,
   the Scala client would lose the cause exceptions by doing such a fallback,
   and those may affect control flow.
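   
   To illustrate (a hypothetical, standalone sketch, not actual connect client
   code): a Scala caller's retry logic may depend on the cause chain, which the
   ErrorInfo-only fallback would drop.
   
   ```scala
   // Hypothetical sketch only: a caller whose control flow depends on the
   // cause chain. If the fallback forwards just the top-level message and
   // class names (as ErrorInfo does), the cause is lost and shouldRetry can
   // never return true.
   object CauseChainExample {
     def shouldRetry(t: Throwable): Boolean =
       Iterator
         .iterate(t)(_.getCause)
         .takeWhile(_ != null)
         .exists(_.isInstanceOf[java.util.concurrent.TimeoutException])
   
     def main(args: Array[String]): Unit = {
       val failure = new RuntimeException(
         "query failed",
         new java.util.concurrent.TimeoutException("broadcast timed out"))
       // Prints true only because the cause exception is preserved.
       println(shouldRetry(failure))
     }
   }
   ```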



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

