alexeykudinkin commented on code in PR #5943:
URL: https://github.com/apache/hudi/pull/5943#discussion_r926897002


##########
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/DeleteHoodieTableCommand.scala:
##########
@@ -36,9 +37,13 @@ case class DeleteHoodieTableCommand(deleteTable: DeleteFromTable) extends Hoodie
 
     // Remove meta fields from the data frame
     var df = removeMetaFields(Dataset.ofRows(sparkSession, table))
-    if (deleteTable.condition.isDefined) {
-      df = df.filter(Column(deleteTable.condition.get))
+    // SPARK-38626 DeleteFromTable.condition is changed from Option[Expression] to Expression in Spark 3.3
+    val condition: Expression = deleteTable.condition match {

Review Comment:
   The overarching idea is to keep this compatibility code confined to Spark-specific components (SparkAdapter, HoodieCatalystExpressionUtils, etc.).
   
   > If modified like that, I think we would still have to handle the type matching at L133 in HoodieAnalysis.scala
   
   Good call. We can then modify the previous suggestion: instead of `resolveDeleteFromCommand`, we can do `extractCondition(DeleteFromCommand)`.
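
   To make the idea concrete, here is a minimal, hypothetical sketch of what such an `extractCondition` helper could look like. The names `DeleteFromTable32`/`DeleteFromTable33` are stand-ins for the two shapes of Spark's `DeleteFromTable` node (Spark <= 3.2 carries `Option[Expression]`, Spark 3.3 carries `Expression` per SPARK-38626) so the snippet compiles without a Spark dependency; the real implementation would pattern-match on the actual Catalyst node inside a version-specific SparkAdapter.

   ```scala
   // Stand-ins for Spark's Expression and the two DeleteFromTable shapes,
   // so this sketch is self-contained (not the real Spark/Hudi API).
   trait Expression
   case object TrueLiteral extends Expression
   case class DeleteFromTable32(condition: Option[Expression]) // Spark <= 3.2 shape
   case class DeleteFromTable33(condition: Expression)         // Spark 3.3 shape

   object ExtractConditionSketch {
     // Normalizes the delete condition across Spark versions to Option[Expression].
     // In real code this would live in the version-specific adapter and match on
     // org.apache.spark.sql.catalyst.plans.logical.DeleteFromTable directly.
     def extractCondition(deleteCommand: Any): Option[Expression] = deleteCommand match {
       case DeleteFromTable32(cond) => cond
       case DeleteFromTable33(cond) => Option(cond)
       case other =>
         throw new IllegalArgumentException(s"Unexpected plan node: $other")
     }
   }
   ```

   Callers such as `DeleteHoodieTableCommand` (and the type matching in HoodieAnalysis.scala) would then work against the normalized `Option[Expression]` and stay free of per-version branching.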



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
