HyukjinKwon commented on code in PR #43352:
URL: https://github.com/apache/spark/pull/43352#discussion_r1427431997


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/utils/ErrorUtils.scala:
##########
@@ -115,6 +115,17 @@ private[connect] object ErrorUtils extends Logging {
           if (sparkThrowable.getErrorClass != null) {
             sparkThrowableBuilder.setErrorClass(sparkThrowable.getErrorClass)
           }
+          for (queryCtx <- sparkThrowable.getQueryContext) {
+            sparkThrowableBuilder.addQueryContexts(
+              FetchErrorDetailsResponse.QueryContext
+                .newBuilder()
+                .setObjectType(queryCtx.objectType())
+                .setObjectName(queryCtx.objectName())
+                .setStartIndex(queryCtx.startIndex())

Review Comment:
   Alright, there are three issues. TL;DR:
   
   1. There are many places in `QueryCompilationErrors` that do not provide an 
`Origin` containing a `QueryContext`. So the `QueryContext` is often 
missing (e.g., the case above). cc @MaxGekk 
   
   2. `withOrigin` is not invoked, and the stacktrace is undefined in the 
Spark Connect server, so we always set `SQLQueryContext`. That's why this 
specific code did not throw an exception from 
https://github.com/apache/spark/pull/43334/files#diff-b3bc05fec45cd951053b2876c71c7730b63789cb4336a7537a6654c724db3241R586-R589
 : the context has never been a `DataFrameQueryContext`. We should invoke 
`withOrigin` in the Spark Connect server. cc @heyihong @juliuszsompolski FYI.
   
   3. The current logic in `ErrorUtils.scala` should handle the case of 
`DataFrameQueryContext`; e.g., `DataFrameQueryContext.stopIndex()` will throw an 
exception. Should we:
     - Set a default value in `DataFrameQueryContext.stopIndex()` instead of 
throwing an exception? @MaxGekk 
     - Or, make the protobuf message for this optional, and throw an exception 
from Spark Connect client sides? @heyihong @juliuszsompolski 

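   For point 3, if we take the "make it optional" route, a minimal sketch of 
reading the indexes defensively so an accessor that throws is treated as an 
absent value (all names here are hypothetical stand-ins, not the actual Spark 
or protobuf classes):
   
   ```scala
   import scala.util.Try
   
   // Illustrative sketch only: QueryContext, DataFrameLikeContext, and
   // safeIndex are hypothetical stand-ins, not Spark's actual classes.
   object QueryContextSketch extends App {
     trait QueryContext {
       def objectType(): String
       def objectName(): String
       def startIndex(): Int
       def stopIndex(): Int
     }
   
     // Mimics a DataFrameQueryContext whose index accessors throw.
     class DataFrameLikeContext extends QueryContext {
       def objectType(): String = "DATAFRAME"
       def objectName(): String = "df.select"
       def startIndex(): Int = throw new UnsupportedOperationException("no index")
       def stopIndex(): Int = throw new UnsupportedOperationException("no index")
     }
   
     // Treat the index as optional instead of letting the accessor's
     // exception escape while building the response.
     def safeIndex(read: () => Int): Option[Int] = Try(read()).toOption
   
     val ctx = new DataFrameLikeContext
     assert(safeIndex(() => ctx.stopIndex()).isEmpty)
     println("stopIndex safely treated as absent")
   }
   ```
   
   The server would then call the optional setter only when the value is 
present, and the client side can decide whether a missing index is an error.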


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/utils/ErrorUtils.scala:
##########
@@ -115,6 +115,17 @@ private[connect] object ErrorUtils extends Logging {
           if (sparkThrowable.getErrorClass != null) {
             sparkThrowableBuilder.setErrorClass(sparkThrowable.getErrorClass)
           }
+          for (queryCtx <- sparkThrowable.getQueryContext) {
+            sparkThrowableBuilder.addQueryContexts(
+              FetchErrorDetailsResponse.QueryContext
+                .newBuilder()
+                .setObjectType(queryCtx.objectType())
+                .setObjectName(queryCtx.objectName())
+                .setStartIndex(queryCtx.startIndex())

Review Comment:
   Alright, there are two issues. TL;DR
   
   1. There are many places that does not provide `Origin` that contains 
`QueryContext` at `QueryCompilationErrors`. So the `QueryContext` is often 
missing (e.g., the case above). cc @MaxGekk 
   
   2. `withOrigin` is not being invoked, and the stacktrace is undefined in 
Spark Connect server. So always we set `SQLQueryContext`. That's why this 
specific code did not throw an exception from 
https://github.com/apache/spark/pull/43334/files#diff-b3bc05fec45cd951053b2876c71c7730b63789cb4336a7537a6654c724db3241R586-R589
 because it has never been `DataFrameContext`. We should invoke `withOrigin` in 
Spark Connect server. cc @heyihong @juliuszsompolski FYI.
   
   3. The current logic in `ErrorUtils.scala` should handle the case of 
`DataFrameQueryContext` e.g., `DataFrameQueryContext.stopIndex()` will throw an 
exception. Should we:
       - Set a default value in `DataFrameQueryContext.stopIndex()` instead of 
throwing an exception? @MaxGekk 
       - Or, make the protobuf message for this optional, and throw an 
exception from Spark Connect client sides? @heyihong @juliuszsompolski 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
