[jira] [Commented] (SPARK-25025) Remove the default value of isAll in INTERSECT/EXCEPT
[ https://issues.apache.org/jira/browse/SPARK-25025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16577623#comment-16577623 ]

Apache Spark commented on SPARK-25025:
--

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/22084

> Remove the default value of isAll in INTERSECT/EXCEPT
> -
>
>                 Key: SPARK-25025
>                 URL: https://issues.apache.org/jira/browse/SPARK-25025
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Xiao Li
>            Assignee: Dilip Biswal
>            Priority: Major
>             Fix For: 2.4.0
>
>
> Having the default value of isAll in the logical plan nodes INTERSECT/EXCEPT
> could introduce bugs when the callers are not aware of it. Let us get rid of
> the default value.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
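The risk described in the issue can be sketched with a small toy example in Scala. The `Intersect` class below is a stand-in written for illustration, not Spark's actual `org.apache.spark.sql.catalyst.plans.logical.Intersect` node: with a defaulted `isAll = false`, a caller who intends INTERSECT ALL but forgets the flag silently compiles and gets DISTINCT semantics instead.

```scala
// Toy sketch (hypothetical class, not Spark's real plan node) showing how a
// default isAll parameter lets a wrong call site compile silently.
case class Intersect(left: Seq[Int], right: Seq[Int], isAll: Boolean = false) {
  def eval: Seq[Int] =
    if (isAll) {
      // INTERSECT ALL: keep duplicates up to the minimum multiplicity on each side.
      val counts = scala.collection.mutable.Map.empty[Int, Int]
      right.foreach(x => counts(x) = counts.getOrElse(x, 0) + 1)
      left.filter { x =>
        val n = counts.getOrElse(x, 0)
        if (n > 0) { counts(x) = n - 1; true } else false
      }
    } else {
      // INTERSECT DISTINCT: set semantics, duplicates collapsed.
      left.distinct.filter(right.contains)
    }
}

val l = Seq(1, 1, 2)
val r = Seq(1, 1, 3)
// Caller meant INTERSECT ALL but omitted the flag: the default quietly
// yields Seq(1) (DISTINCT) instead of Seq(1, 1) (ALL).
val accidental = Intersect(l, r).eval
val intended   = Intersect(l, r, isAll = true).eval
```

Removing the default turns the forgotten argument into a compile error at every call site, which is the fix the issue proposes.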
[jira] [Commented] (SPARK-25025) Remove the default value of isAll in INTERSECT/EXCEPT
[ https://issues.apache.org/jira/browse/SPARK-25025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16569307#comment-16569307 ]

Apache Spark commented on SPARK-25025:
--

User 'dilipbiswal' has created a pull request for this issue:
https://github.com/apache/spark/pull/22000

> Remove the default value of isAll in INTERSECT/EXCEPT
> -
>
>                 Key: SPARK-25025
>                 URL: https://issues.apache.org/jira/browse/SPARK-25025
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Xiao Li
>            Assignee: Dilip Biswal
>            Priority: Major
>
>
> Having the default value of isAll in the logical plan nodes INTERSECT/EXCEPT
> could introduce bugs when the callers are not aware of it. Let us get rid of
> the default value.