aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1129875106


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I agree it varies from database to database, but why do we have different validation behavior for regular inserts? I am a bit worried we would have the same problem as with V1 sources, where the validation differed between SQL and DataFrame writes. It feels to me that V2 commands should behave in the same way.
   
   This is not a blocker for this PR, but I'd be curious to understand our long-term plan. Do we plan to switch to runtime checks for regular inserts too?
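   
   A minimal sketch of the V1 divergence referred to above (local Spark, hypothetical table `t`): a SQL `INSERT INTO` resolves columns by position, while `DataFrameWriter.saveAsTable` in append mode resolves them by name, so the two write paths apply different checks to the same table.
   
   ```scala
   import org.apache.spark.sql.{SaveMode, SparkSession}
   
   object V1WriteDivergenceSketch {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .appName("v1-write-divergence")
         .master("local[*]")
         .getOrCreate()
       import spark.implicits._
   
       // Hypothetical target table used only for illustration.
       spark.sql("CREATE TABLE t (id INT, name STRING) USING parquet")
   
       // SQL path: values are matched to columns by position at analysis time.
       spark.sql("INSERT INTO t VALUES (1, 'a')")
   
       // DataFrame path: saveAsTable in append mode matches columns by name,
       // so a DataFrame with reordered columns still writes correctly.
       Seq(("b", 2)).toDF("name", "id")
         .write.mode(SaveMode.Append).saveAsTable("t")
   
       spark.sql("SELECT * FROM t").show()
       spark.stop()
     }
   }
   ```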



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.


