[ https://issues.apache.org/jira/browse/SPARK-25276?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597027#comment-16597027 ]
Apache Spark commented on SPARK-25276:
--------------------------------------

User 'ajithme' has created a pull request for this issue:
https://github.com/apache/spark/pull/22277

> Redundant constrains when using alias
> -------------------------------------
>
>                 Key: SPARK-25276
>                 URL: https://issues.apache.org/jira/browse/SPARK-25276
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.3.1
>            Reporter: Ajith S
>            Priority: Major
>         Attachments: test.patch
>
> Attaching a test to reproduce the issue. The test fails with the following message:
> {color:#FF0000}=== FAIL: Constraints do not match ==={color}
> {color:#FF0000}Found: isnotnull(z#5),(z#5 > 10),(x#3 > 10),(z#5 <=> x#3),(b#1 <=> y#4),isnotnull(x#3){color}
> {color:#FF0000}Expected: (x#3 > 10),isnotnull(x#3),(b#1 <=> y#4),(z#5 <=> x#3){color}
> {color:#FF0000}== Result =={color}
> {color:#FF0000}Missing: N/A{color}
> {color:#FF0000}Found but not expected: isnotnull(z#5),(z#5 > 10){color}
> Here I think that, because z has an EqualNullSafe comparison with x, also carrying isnotnull(z#5) and (z#5 > 10) is redundant: both facts are derivable from the corresponding constraints on x via (z#5 <=> x#3). If a query has a lot of aliases, these duplicated constraints may cause overhead.
> So I suggest that in
> https://github.com/apache/spark/blob/v2.3.2-rc5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.scala#L254
> instead of appending with ++= (addAll), we should just assign with =.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
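The redundancy argument above can be sketched with a small, self-contained Scala example. Plain strings stand in for Catalyst `Expression` trees, and `pruneAliased` is an illustrative helper, not Spark's actual API: given a null-safe-equal pair recorded as a constraint, any constraint on the aliased attribute whose twin on the original attribute is already present can be dropped.

```scala
// Sketch of the redundancy claim from the test failure above.
// Strings stand in for Catalyst Expression trees; all names are
// illustrative assumptions, not Spark's real classes or methods.
object RedundantAliasConstraints {

  // Drop constraints that are mere copies under a null-safe-equal
  // attribute pair: with (z#5 <=> x#3) in the set, a fact about z#5
  // is derivable from the same fact stated on x#3.
  def pruneAliased(constraints: Set[String],
                   alias: (String, String)): Set[String] = {
    val (aliased, original) = alias
    constraints.filterNot { c =>
      c != s"($aliased <=> $original)" &&      // keep the link itself
        c.contains(aliased) &&                 // mentions the alias...
        constraints.contains(c.replace(aliased, original)) // ...and has a twin
    }
  }

  def main(args: Array[String]): Unit = {
    // The "Found" set from the failing test.
    val found = Set(
      "isnotnull(z#5)", "(z#5 > 10)",          // redundant copies
      "isnotnull(x#3)", "(x#3 > 10)",
      "(z#5 <=> x#3)", "(b#1 <=> y#4)")

    val pruned = pruneAliased(found, ("z#5", "x#3"))
    println(pruned.toSeq.sorted.mkString(", "))
    // prints: (b#1 <=> y#4), (x#3 > 10), (z#5 <=> x#3), isnotnull(x#3)
    // i.e. exactly the "Expected" set from the test.
  }
}
```

This only illustrates why the extra constraints are derivable; whether replacing `++=` with `=` in `getAliasedConstraints` is the right in-tree fix is the subject of the linked pull request.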