Github user dbtsai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21852#discussion_r206695251
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala ---
    @@ -416,6 +416,21 @@ object SimplifyConditionals extends Rule[LogicalPlan] with PredicateHelper {
             // these branches can be pruned away
             val (h, t) = branches.span(_._1 != TrueLiteral)
             CaseWhen( h :+ t.head, None)
    +
    +      case e @ CaseWhen(branches, Some(elseValue)) if branches
    +        .forall(_._2.semanticEquals(elseValue)) =>
    +        // For non-deterministic conditions with side effects, we can not remove or reorder
    +        // them. As a result, we try to remove the deterministic conditions from the tail.
    --- End diff --
    
    Should be 
    
    ```scala
      // Check the branch's condition (the first element of the tuple), not the tuple itself.
      hitNonDetermin = !branches(i - 1)._1.deterministic
      if (!hitNonDetermin) {
        i -= 1
      }
    ```
    
    Personally, I prefer the functional style, but the Java style is more efficient here. I've updated it as you suggested.
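    For context, the backward scan under discussion can be sketched in isolation. This is a hedged, simplified model, not the actual Spark code: `Branch` and its `deterministic` flag are toy stand-ins for Catalyst's `(condition, value)` branch pairs and `Expression.deterministic`, and `keepCount` is a hypothetical helper name.
    
    ```scala
    object TrailingBranchPruneSketch {
      // Toy stand-in for a CaseWhen branch; only the condition's
      // determinism matters for this sketch.
      final case class Branch(deterministic: Boolean)
    
      // Scan from the tail, dropping deterministic branches until the first
      // non-deterministic one is hit. Returns how many leading branches to keep.
      def keepCount(branches: IndexedSeq[Branch]): Int = {
        var hitNonDetermin = false
        var i = branches.length
        while (i > 0 && !hitNonDetermin) {
          hitNonDetermin = !branches(i - 1).deterministic
          if (!hitNonDetermin) {
            i -= 1
          }
        }
        i
      }
    }
    ```
    
    For example, with branches (deterministic, non-deterministic, deterministic), only the trailing deterministic branch is pruned, because the scan stops at the first non-deterministic condition and never touches anything before it.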


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
