RussellSpitzer commented on a change in pull request #3757:
URL: https://github.com/apache/iceberg/pull/3757#discussion_r771737349
##########
File path: spark/v3.2/spark/src/main/scala/org/apache/spark/sql/execution/datasources/SparkExpressionConverter.scala
##########
@@ -30,4 +34,18 @@ object SparkExpressionConverter {
     // But these two conversions already exist and well tested. So, we are going with this approach.
     SparkFilters.convert(DataSourceStrategy.translateFilter(sparkExpression, supportNestedPredicatePushdown = true).get)
}
+
+ @throws[AnalysisException]
+  def collectResolvedSparkExpression(session: SparkSession, tableName: String, where: String): Expression = {
Review comment:
I mean, ideally we could get all of this into Spark itself, so that Spark would do these conversions for us and just hand us filters, but I don't think there is any way around this at the moment.
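
For context, a minimal sketch of what `collectResolvedSparkExpression` could look like: parse the `where` string into an unresolved Catalyst expression, wrap it in a `Filter` over the target table, and let Spark's analyzer bind the column references. This is an illustration of the general approach under discussion, not the PR's actual implementation; the exact plan shape and helper usage here are assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.catalyst.plans.logical.Filter

// Sketch only: resolve a WHERE-clause string against a table's schema
// by building Filter(predicate, relation) and running the analyzer.
def collectResolvedSparkExpression(session: SparkSession, tableName: String,
    where: String): Expression = {
  // Parse the predicate text into an unresolved Catalyst expression
  val unresolved = session.sessionState.sqlParser.parseExpression(where)
  // Parse the (possibly multi-part) table name, e.g. "db.tbl"
  val ident = session.sessionState.sqlParser.parseMultipartIdentifier(tableName)
  // Wrap in a Filter over the unresolved relation so the analyzer
  // can bind attribute references to the table's columns
  val plan = Filter(unresolved, UnresolvedRelation(ident))
  // Run analysis and pull the now-resolved condition back out;
  // unresolvable columns surface as an AnalysisException
  session.sessionState.analyzer.execute(plan) match {
    case Filter(condition, _) => condition
  }
}
```

The resolved `Expression` can then be fed through `DataSourceStrategy.translateFilter` and `SparkFilters.convert`, as in the existing code above.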
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]