Zouxxyy commented on code in PR #11153:
URL: https://github.com/apache/incubator-gluten/pull/11153#discussion_r2567984675


##########
gluten-substrait/src/main/scala/org/apache/gluten/extension/columnar/PushDownFilterToScan.scala:
##########
@@ -31,35 +31,15 @@ object PushDownFilterToScan extends Rule[SparkPlan] with PredicateHelper {
   override def apply(plan: SparkPlan): SparkPlan = plan.transformUp {
     case filter: FilterExecTransformerBase =>
       filter.child match {
-        case fileScan: FileSourceScanExecTransformer =>
-          val pushDownFilters =
-            BackendsApiManager.getSparkPlanExecApiInstance.postProcessPushDownFilter(
-              splitConjunctivePredicates(filter.cond),
-              fileScan)
-          val newScan = fileScan.copy(dataFilters = pushDownFilters)
+        case scan: BasicScanExecTransformer
+            if BackendsApiManager.getSparkPlanExecApiInstance.supportPushDownFilterToScan(
+              scan) && scan.pushDownFilters.isEmpty =>

Review Comment:
   Thanks, updated. It seems more appropriate to let Spark handle the planning itself.
   
   Previously, I explicitly set `Some(Seq.empty)` to indicate that a scan (e.g., `HiveTableScanExecTransformer` or `MicroBatchScanExecTransformer`) does not support filter pushdown.
   
   Now that this has been removed, I've introduced a `supportPushDownFilters` flag to explicitly indicate whether filter pushdown is supported by the scan.
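   A minimal sketch of how such a flag could replace the `Some(Seq.empty)` convention (the member names and simplified types here are hypothetical, not the actual Gluten API):
   
   ```scala
   // Hypothetical sketch: a boolean flag on the scan transformer replaces the
   // old convention of returning Some(Seq.empty) to mean "no pushdown support".
   trait BasicScanExecTransformer {
     // Scans that cannot accept pushed-down filters override this to false.
     def supportPushDownFilters: Boolean = true
     def pushDownFilters: Seq[String] = Seq.empty // filters kept simplified here
   }
   
   // A scan without pushdown support simply flips the flag.
   class HiveTableScanExecTransformer extends BasicScanExecTransformer {
     override def supportPushDownFilters: Boolean = false
   }
   ```
   
   The rule would then guard its `case scan: BasicScanExecTransformer` match on `scan.supportPushDownFilters`, so unsupported scans fall through and Spark plans the filter itself.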



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
