[ https://issues.apache.org/jira/browse/SPARK-12218?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15048619#comment-15048619 ]
Irakli Machabeli commented on SPARK-12218:
------------------------------------------

I'm afraid I don't really know what that means, "plan by explain(true)". Shall I type it in the REPL?

[ https://issues.apache.org/jira/browse/SPARK-12218?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15047928#comment-15047928 ]

Xiao Li commented on SPARK-12218:
---------------------------------

Could you provide the plan produced by explain(true)? [~imachabeli] Thanks!

> Boolean logic in SQL does not work: "not (A and B)" is not the same as "(not A) or (not B)"
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-12218
>                 URL: https://issues.apache.org/jira/browse/SPARK-12218
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.2
>            Reporter: Irakli Machabeli
>            Priority: Blocker
>
> Two logically equivalent queries produce different results:
>
> In [2]: sqlContext.read.parquet('prp_enh1').where(" LoanID=62231 and not( PaymentsReceived=0 and ExplicitRoll in ('PreviouslyPaidOff', 'PreviouslyChargedOff'))").count()
> Out[2]: 18
>
> In [3]: sqlContext.read.parquet('prp_enh1').where(" LoanID=62231 and ( not(PaymentsReceived=0) or not (ExplicitRoll in ('PreviouslyPaidOff', 'PreviouslyChargedOff')))").count()
> Out[3]: 28

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
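By De Morgan's laws, "not (A and B)" is equivalent to "(not A) or (not B)", so the two queries above should return the same count. A minimal pure-Python sketch of the expected equivalence, using hypothetical sample rows (not the reporter's parquet data, which is not available here):

```python
# De Morgan's laws: not (A and B) == (not A) or (not B).
# Hypothetical rows standing in for the reporter's 'prp_enh1' data.
rows = [
    {"LoanID": 62231, "PaymentsReceived": 0, "ExplicitRoll": "PreviouslyPaidOff"},
    {"LoanID": 62231, "PaymentsReceived": 5, "ExplicitRoll": "PreviouslyPaidOff"},
    {"LoanID": 62231, "PaymentsReceived": 0, "ExplicitRoll": "Current"},
    {"LoanID": 62231, "PaymentsReceived": 5, "ExplicitRoll": "Current"},
    {"LoanID": 11111, "PaymentsReceived": 0, "ExplicitRoll": "PreviouslyChargedOff"},
]

ROLLS = ("PreviouslyPaidOff", "PreviouslyChargedOff")

def pred_not_and(r):
    # LoanID=62231 and not(PaymentsReceived=0 and ExplicitRoll in (...))
    return r["LoanID"] == 62231 and not (
        r["PaymentsReceived"] == 0 and r["ExplicitRoll"] in ROLLS
    )

def pred_or_not(r):
    # LoanID=62231 and (not(PaymentsReceived=0) or not(ExplicitRoll in (...)))
    return r["LoanID"] == 62231 and (
        r["PaymentsReceived"] != 0 or r["ExplicitRoll"] not in ROLLS
    )

count_a = sum(1 for r in rows if pred_not_and(r))
count_b = sum(1 for r in rows if pred_or_not(r))
assert count_a == count_b  # equivalent predicates must agree on every row
```

If Spark returns 18 for one form and 28 for the other on the same data, as reported, one of the two filters is being evaluated incorrectly; the optimized plans from explain(true) would show where they diverge.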