dongjoon-hyun commented on pull request #29075:
URL: https://github.com/apache/spark/pull/29075#issuecomment-657275435


   Thank you for updating, @gengliangwang . Shall we adjust this test case name accordingly together?
   ```
   test("SPARK-32284: Avoid pushing down too many predicates in partition pruning") {
   ```
   
   BTW, in the test case, since `20` looks like a reasonably small number in the Spark world, could you use a more functional word to describe the change? For example, this PR does not limit based on the number of predicates, like `10 is possible, but 20 is not allowed`. Apache Spark will still hit the HMS issue after this PR when we have a very long SQL query with `too many predicates`. So, this PR doesn't fix `Avoid pushing down too many predicates in partition pruning`. Instead, it mitigates the regression caused by the previous improvement PR.
   ```
   We should push down the convertible original predicates as they are, instead of converting all predicates into CNF.
   We can skip grouping expressions so that we can stop the CNF conversion when the predicates become too long.
   ```
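
   To illustrate why stopping the CNF conversion matters (this is only a toy sketch, not Spark's actual Catalyst implementation — the expression encoding and function names here are invented for demonstration): distributing OR over AND multiplies clause counts, so a filter with `n` OR-ed pairs of AND-ed predicates expands to `2**n` CNF clauses. This is why any fixed predicate-count limit is arbitrary and the conversion itself needs to be bounded.
   ```python
   # Toy sketch (NOT Spark's Catalyst code) of naive CNF conversion blow-up.
   # An expression is a tuple: ("leaf", name), ("and", l, r), or ("or", l, r).

   def to_cnf_clauses(expr):
       """Return CNF as a list of clauses; each clause is a frozenset of names."""
       op = expr[0]
       if op == "leaf":
           return [frozenset([expr[1]])]
       if op == "and":
           # AND simply concatenates the clause lists of both sides.
           return to_cnf_clauses(expr[1]) + to_cnf_clauses(expr[2])
       if op == "or":
           # Distribute OR over AND:
           # (A1 ^ A2) v (B1 ^ B2) = (A1vB1) ^ (A1vB2) ^ (A2vB1) ^ (A2vB2)
           return [lc | rc
                   for lc in to_cnf_clauses(expr[1])
                   for rc in to_cnf_clauses(expr[2])]
       raise ValueError(f"unknown op: {op}")

   def or_of_and_pairs(n):
       """Build (a1 AND b1) OR (a2 AND b2) OR ... OR (an AND bn)."""
       pairs = [("and", ("leaf", f"a{i}"), ("leaf", f"b{i}"))
                for i in range(1, n + 1)]
       expr = pairs[0]
       for p in pairs[1:]:
           expr = ("or", expr, p)
       return expr

   for n in (1, 5, 10, 15):
       print(f"n={n:2d} -> {len(to_cnf_clauses(or_of_and_pairs(n)))} CNF clauses")
       # n= 1 ->     2, n= 5 ->    32, n=10 ->  1024, n=15 -> 32768
   ```
   The clause count doubles with each OR-ed pair, which is why a long query can overwhelm HMS regardless of where a numeric cutoff like `20` is placed.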


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
