[ 
https://issues.apache.org/jira/browse/SPARK-49113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-49113.
------------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 47611
[https://github.com/apache/spark/pull/47611]

> Remove assert when translating expressions in DataSourceStrategy when
> ---------------------------------------------------------------------
>
>                 Key: SPARK-49113
>                 URL: https://issues.apache.org/jira/browse/SPARK-49113
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 4.0.0
>            Reporter: Milan Stefanovic
>            Assignee: Milan Stefanovic
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> In `DataSourceV2Strategy`, when we translate filters, we initialise a 
> `V2ExpressionBuilder` and assert that the result of the translation is an 
> instance of `Predicate`.
> Since `V2ExpressionBuilder` gives no compile-time guarantee of returning 
> `Predicate`s, it is very easy to introduce bugs and regressions.
> E.g. we add a new translation for an expression that returns a boolean, but 
> forget to make it extend `Predicate`. Suddenly, queries that use this 
> expression in their filters start failing, because we try to push it down 
> and the assert fails.
> Instead of this approach, we should be able to absorb such bugs from 
> `V2ExpressionBuilder` by returning `None` whenever we encounter a translated 
> expression that is not a `Predicate`.
> This way, it is as if we are saying:
> if the translated expression is not of type `Predicate`, act as if we did 
> not know how to translate the expression in the first place.
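> A minimal sketch of the proposed behavior (not the actual change in the 
> linked pull request), assuming a translation helper in `DataSourceV2Strategy` 
> that wraps `V2ExpressionBuilder`; the helper name and exact signatures here 
> are illustrative:
> ```scala
> import org.apache.spark.sql.catalyst.expressions.Expression
> import org.apache.spark.sql.catalyst.util.V2ExpressionBuilder
> import org.apache.spark.sql.connector.expressions.filter.Predicate
>
> // Before: an assert/cast forced the translated expression to be a Predicate,
> // so a builder bug producing a non-Predicate boolean expression failed the query.
> // Sketch of the proposal: pattern-match and fall back to None ("cannot push down").
> def translateFilterV2(expr: Expression): Option[Predicate] = {
>   new V2ExpressionBuilder(expr, true).build() match {
>     case Some(p: Predicate) => Some(p) // translated to a proper Predicate: push it down
>     case _                  => None    // anything else: treat as untranslatable
>   }
> }
> ```
> With this shape, a missing `Predicate` supertype on a new translation only 
> disables pushdown for that filter instead of failing the whole query.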


