Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-12-12 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2540345223

   > Hi @zml1206 , do you have time to continue the work now since the nested 
WITH is supported?
   
   @cloud-fan thank you for the ping, I briefly tried it yesterday. It is a bit 
tricky to make a complex expression be calculated only once. For example, an 
expression can be partially pushed down, and the pushed-down part and the part 
that is not pushed down may share a common expression. In addition, the alias 
used to replace the pushed-down expression should also be reused and rewritten 
with the projectList generated by With. I still need to spend some time 
designing this area carefully.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-12-12 Thread via GitHub


cloud-fan commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2540220771

   Hi @zml1206 , do you have time to continue the work now since the nested 
WITH is supported?





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-25 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2132040690

   > Can we make `With` nested as well?
   
   I optimized nested `With` in #46741, could you take a look when you have 
time? Thank you. @cloud-fan 





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-24 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2129064493

   > Can we make `With` nested as well?
   
   
   I have thought about it for a long time, and I could change the logic of 
withRewrite so that the `alias` generated by the lowest-level `With` ends up in 
the uppermost Project, which is the reverse of the current order. That would 
solve the complex nesting above.
   ```
   with(
     with(
       ((commonexpressionref(CommonExpressionId(1,false), IntegerType, true) +
         commonexpressionref(CommonExpressionId(1,false), IntegerType, true)) >
         commonexpressionref(CommonExpressionId(0,false), IntegerType, true)),
       commonexpressiondef(
         (commonexpressionref(CommonExpressionId(0,false), IntegerType, true) *
          commonexpressionref(CommonExpressionId(0,false), IntegerType, true)),
         CommonExpressionId(1,false))),
     commonexpressiondef((a#0 + a#0), CommonExpressionId(0,false)))
   ```
   Do you think it’s feasible? If so, I’ll do it.
   





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-22 Thread via GitHub


cloud-fan commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2124511621

   Can we make `With` nested as well?





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-22 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2124277717

   @cloud-fan I tried to push down the predicate through `With`, but found that 
when there is complex nesting, a legal `With` cannot be generated: the common 
expression would exist in both the child and the defs of the `With`.
   For example:
   ```
   SELECT *
   FROM (
     SELECT c, c * c AS d
     FROM (
       SELECT a + a AS c
       FROM t1
     ) t2
     GROUP BY c
   ) t3
   WHERE d + d > c + c
   ```
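
   To make the conflict concrete, here is a toy, non-Spark sketch (the `inline` 
helper and the alias table are invented for illustration, not Catalyst code). 
It symbolically inlines the projection aliases into the filter predicate; the 
fully substituted predicate references `a + a` both directly (via `c`) and 
inside the definition of `d`, which is why a single flat `With` cannot hold 
one definition for it:

   ```python
   # Hypothetical sketch: inline projection aliases into a predicate to
   # show where the shared subexpression `a + a` ends up.

   defs = {
       "c": "(a + a)",   # from the inner project t2
       "d": "(c * c)",   # from the middle project t3
   }

   def inline(expr: str, defs: dict) -> str:
       """Repeatedly replace alias references with their definitions
       until the expression no longer changes."""
       changed = True
       while changed:
           changed = False
           for alias, definition in defs.items():
               new = expr.replace(alias, definition)
               if new != expr:
                   expr, changed = new, True
       return expr

   predicate = "d + d > c + c"
   # `a + a` appears both inside the expansion of `d` and directly via `c`.
   print(inline(predicate, defs))
   ```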





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-16 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2116500596

   `With` is a good idea, thank you very much @cloud-fan. Closing this PR.





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-16 Thread via GitHub


zml1206 closed pull request #46499: [SPARK-48213][SQL] Do not push down 
predicate if non-cheap expression exceed reused limit
URL: https://github.com/apache/spark/pull/46499





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-16 Thread via GitHub


cloud-fan commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2115258124

   I think https://github.com/apache/spark/pull/45802#issuecomment-2101762336 
is a better idea.





Re: [PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-16 Thread via GitHub


zml1206 commented on PR #46499:
URL: https://github.com/apache/spark/pull/46499#issuecomment-2114700517

   cc @cloud-fan 





[PR] [SPARK-48213][SQL] Do not push down predicate if non-cheap expression exceed reused limit [spark]

2024-05-09 Thread via GitHub


zml1206 opened a new pull request, #46499:
URL: https://github.com/apache/spark/pull/46499

   ### What changes were proposed in this pull request?
   Avoid pushing down a predicate if a non-cheap expression would exceed the 
reuse limit. Pushing a predicate down through a Project/Aggregate requires 
replacing alias references with their defining expressions; if an expression is 
non-cheap and reused many times, the cost of the repeated computation may 
outweigh the benefit of predicate pushdown.
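
   As a rough illustration of the proposed guard (a toy model only: 
`REUSE_LIMIT`, `should_push_down`, and counting references by string matching 
are invented for this sketch and are not the actual Catalyst rule):

   ```python
   # Toy model: pushing a filter below a projection substitutes every
   # reference to an alias with its defining expression, so a non-cheap
   # expression referenced N times would be computed N times per row.

   REUSE_LIMIT = 2  # hypothetical threshold for illustration

   def should_push_down(predicate: str, alias: str, is_cheap: bool) -> bool:
       """Allow pushdown unless a non-cheap alias would be duplicated
       more than REUSE_LIMIT times by the substitution."""
       references = predicate.count(alias)
       return is_cheap or references <= REUSE_LIMIT

   # Cheap projection (e.g. a column rename): pushdown is always fine.
   print(should_push_down("c > 0 AND c < 10", "c", is_cheap=True))   # True
   # Non-cheap alias referenced 3 times: skip pushdown to avoid recomputation.
   print(should_push_down("c + c + c > 0", "c", is_cheap=False))     # False
   ```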
   
   
   ### Why are the changes needed?
   Like #33958, this avoids performance regressions and larger plans (such as 
nested CASE WHEN). The difference is that pushdown brings additional benefits, 
so the reuse-count limit is `Int.MaxValue` instead of 1.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   
   ### How was this patch tested?
   Unit test.
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.
   

