leung-ming commented on code in PR #2069:
URL: https://github.com/apache/datafusion-comet/pull/2069#discussion_r2254913075


##########
native/spark-expr/src/agg_funcs/sum_decimal.rs:
##########
@@ -287,24 +287,16 @@ impl SumDecimalGroupsAccumulator {
         !self.is_empty.get_bit(index) && !self.is_not_null.get_bit(index)
     }
 
+    #[inline]
     fn update_single(&mut self, group_index: usize, value: i128) {
-        if unlikely(self.is_overflow(group_index)) {
-            // This means there's a overflow in decimal, so we will just skip the rest
-            // of the computation
-            return;
-        }
-
         self.is_empty.set_bit(group_index, false);
         let (new_sum, is_overflow) = self.sum[group_index].overflowing_add(value);
+        self.sum[group_index] = new_sum;
 
-        if is_overflow || !is_valid_decimal_precision(new_sum, self.precision) {
+        if unlikely(is_overflow || !is_valid_decimal_precision(new_sum, self.precision)) {

Review Comment:
   > I don't think we should use `likely` and `unlikely`. The `hashbrown` crate removed these sorts of hints after finding they did more harm than good. In practice, unless you have PGO, we should let the compiler and branch predictor do the work. I'll probably open an issue to discuss removing them.
   
   I only care about the aggregate benchmark here. Looking forward to seeing the discussion, along with something like TPC benchmark results with and without `likely` and `unlikely`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

