kazuyukitanimura commented on code in PR #471:
URL: https://github.com/apache/datafusion-comet/pull/471#discussion_r1620012522


##########
spark/src/main/scala/org/apache/comet/serde/QueryPlanSerde.scala:
##########
@@ -1959,9 +1959,19 @@ object QueryPlanSerde extends Logging with ShimQueryPlanSerde with CometExprShim
 
         case UnaryMinus(child, _) =>

Review Comment:
   Hmmm, should we use the second member `failOnError` instead of reading `SQLConf.get.ansiEnabled`?
   
   Looks like `EvalMode::Try` is not really used, if I understand correctly.



##########
core/src/execution/datafusion/expressions/negative.rs:
##########
@@ -0,0 +1,304 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+use crate::{errors::CometError, execution::datafusion::expressions::cast::EvalMode};
+use arrow::compute::kernels::numeric::neg_wrapping;
+use arrow_array::RecordBatch;
+use arrow_schema::{DataType, Schema};
+use datafusion::{
+    logical_expr::{interval_arithmetic::Interval, ColumnarValue},
+    physical_expr::PhysicalExpr,
+};
+use datafusion_common::{Result, ScalarValue};
+use datafusion_physical_expr::{
+    aggregate::utils::down_cast_any_ref, sort_properties::SortProperties,
+};
+use std::{
+    any::Any,
+    hash::{Hash, Hasher},
+    sync::Arc,
+};
+
+pub fn create_negate_expr(
+    expr: Arc<dyn PhysicalExpr>,
+    eval_mode: EvalMode,
+) -> Result<Arc<dyn PhysicalExpr>, CometError> {
+    Ok(Arc::new(NegativeExpr::new(expr, eval_mode)))
+}
+
+/// Negative expression
+#[derive(Debug, Hash)]
+pub struct NegativeExpr {
+    /// Input expression
+    arg: Arc<dyn PhysicalExpr>,
+    eval_mode: EvalMode,
+}
+
+fn arithmetic_overflow_error(from_type: &str) -> CometError {
+    CometError::ArithmeticOverflow {
+        from_type: from_type.to_string(),
+    }
+}
+
+macro_rules! check_overflow {
+    ($array:expr, $array_type:ty, $min_val:expr, $max_val:expr, $type_name:expr) => {{
+        let typed_array = $array
+            .as_any()
+            .downcast_ref::<$array_type>()
+            .expect(concat!(stringify!($array_type), " expected"));
+        for i in 0..typed_array.len() {
+            if typed_array.value(i) == $min_val || typed_array.value(i) == $max_val {

Review Comment:
   I think we only care about the min case, not the max.
   E.g.
   ```
   -(127i8)  => -127i8
   -(-128i8) => overflow
   ```
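   A quick way to see this in plain Rust (unrelated to the macro itself, just the two's-complement asymmetry):

```rust
fn main() {
    // Negating the max value is fine: -(127) = -127 still fits in i8.
    assert_eq!(i8::MAX.checked_neg(), Some(-127i8));
    // Negating the min value overflows: -(-128) = 128 does not fit in i8.
    assert_eq!(i8::MIN.checked_neg(), None);
    // The same asymmetry holds for every signed width, e.g. i32.
    assert_eq!(i32::MIN.checked_neg(), None);
}
```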



##########
core/src/execution/datafusion/planner.rs:
##########
@@ -566,8 +567,20 @@ impl PhysicalPlanner {
                 Ok(Arc::new(NotExpr::new(child)))
             }
             ExprStruct::Negative(expr) => {
-                let child = self.create_expr(expr.child.as_ref().unwrap(), input_schema)?;
-                Ok(Arc::new(NegativeExpr::new(child)))
+                let child: Arc<dyn PhysicalExpr> =
+                    self.create_expr(expr.child.as_ref().unwrap(), input_schema.clone())?;
+                let eval_mode = match expr.eval_mode.as_str() {
+                    "ANSI" => EvalMode::Ansi,
+                    "TRY" => EvalMode::Try,
+                    "LEGACY" => EvalMode::Legacy,
+                    other => {
+                        return Err(ExecutionError::GeneralError(format!(
+                            "Invalid EvalMode: \"{other}\""
+                        )))
+                    }
+                };
+                let result = negative::create_negate_expr(child, eval_mode);

Review Comment:
   Wondering if there are different approaches to avoid fully re-implementing `NegativeExpr`:
   A. implement this in DataFusion
   B. wrap with another operator, like `NegativeExpr::new(IsEqualToMin::new(child))`
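   For option B, the rough shape could be a small pre-check composed in front of the stock wrapping negation. A minimal standalone sketch (`negate_fail_on_min` and the error string are hypothetical, not existing DataFusion/Comet APIs):

```rust
// Sketch of option B: reject the minimum value up front, then delegate to
// the ordinary wrapping negation. Names here are illustrative only.
fn negate_fail_on_min(values: &[i32]) -> Result<Vec<i32>, String> {
    if values.contains(&i32::MIN) {
        // Mirrors Spark's ANSI behavior: error out instead of wrapping.
        return Err("integer overflow".to_string());
    }
    Ok(values.iter().map(|v| v.wrapping_neg()).collect())
}

fn main() {
    assert_eq!(negate_fail_on_min(&[1, -2, 3]), Ok(vec![-1, 2, -3]));
    assert!(negate_fail_on_min(&[i32::MIN]).is_err());
}
```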



##########
spark/src/test/scala/org/apache/comet/CometExpressionSuite.scala:
##########
@@ -1469,5 +1469,54 @@ class CometExpressionSuite extends CometTestBase with AdaptiveSparkPlanHelper {
       }
     }
   }
+  test("unary negative integer overflow test") {
+    def withAnsiMode(enabled: Boolean)(f: => Unit): Unit = {
+      withSQLConf(
+        SQLConf.ANSI_ENABLED.key -> enabled.toString,
+        CometConf.COMET_ANSI_MODE_ENABLED.key -> enabled.toString,
+        CometConf.COMET_ENABLED.key -> "true",
+        CometConf.COMET_EXEC_ENABLED.key -> "true")(f)
+    }
+
+    def checkOverflow(query: String): Unit = {
+      checkSparkMaybeThrows(sql(query)) match {
+        case (Some(sparkException), Some(cometException)) =>
+          assert(sparkException.getMessage.contains("integer overflow"))
+          assert(cometException.getMessage.contains("integer overflow"))
+        case (None, None) =>
+          fail("Exception should be thrown")
+        case (None, Some(_)) =>
+          fail("Comet threw an exception but Spark did not")
+        case (Some(_), None) =>
+          fail("Spark threw an exception but Comet did not")
+      }
+    }
+
+    def runArrayTest(query: String, path: String): Unit = {
+      withParquetTable(path, "t") {
+        withAnsiMode(enabled = false) {
+          checkSparkAnswerAndOperator(sql(query))
+        }
+        withAnsiMode(enabled = true) {
+          checkOverflow(query)
+        }
+      }
+    }
+
+    withTempDir { dir =>
+      // Array values test
+      val arrayPath = new Path(dir.toURI.toString, "array_test.parquet").toString
+      Seq(Int.MaxValue, Int.MinValue).toDF("a").write.mode("overwrite").parquet(arrayPath)

Review Comment:
   Hmm, it may be a good idea to test with more types (e.g. byte, short, long, float, double).



##########
core/src/execution/datafusion/expressions/negative.rs:
##########
@@ -0,0 +1,304 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+use crate::{errors::CometError, execution::datafusion::expressions::cast::EvalMode};
+use arrow::compute::kernels::numeric::neg_wrapping;
+use arrow_array::RecordBatch;
+use arrow_schema::{DataType, Schema};
+use datafusion::{
+    logical_expr::{interval_arithmetic::Interval, ColumnarValue},
+    physical_expr::PhysicalExpr,
+};
+use datafusion_common::{Result, ScalarValue};
+use datafusion_physical_expr::{
+    aggregate::utils::down_cast_any_ref, sort_properties::SortProperties,
+};
+use std::{
+    any::Any,
+    hash::{Hash, Hasher},
+    sync::Arc,
+};
+
+pub fn create_negate_expr(
+    expr: Arc<dyn PhysicalExpr>,
+    eval_mode: EvalMode,
+) -> Result<Arc<dyn PhysicalExpr>, CometError> {
+    Ok(Arc::new(NegativeExpr::new(expr, eval_mode)))
+}
+
+/// Negative expression
+#[derive(Debug, Hash)]
+pub struct NegativeExpr {
+    /// Input expression
+    arg: Arc<dyn PhysicalExpr>,
+    eval_mode: EvalMode,
+}
+
+fn arithmetic_overflow_error(from_type: &str) -> CometError {
+    CometError::ArithmeticOverflow {
+        from_type: from_type.to_string(),
+    }
+}
+
+macro_rules! check_overflow {
+    ($array:expr, $array_type:ty, $min_val:expr, $max_val:expr, $type_name:expr) => {{
+        let typed_array = $array
+            .as_any()
+            .downcast_ref::<$array_type>()
+            .expect(concat!(stringify!($array_type), " expected"));
+        for i in 0..typed_array.len() {
+            if typed_array.value(i) == $min_val || typed_array.value(i) == $max_val {
+                return Err(arithmetic_overflow_error($type_name).into());
+            }
+        }
+    }};
+}
+
+impl NegativeExpr {
+    /// Create new not expression
+    pub fn new(arg: Arc<dyn PhysicalExpr>, eval_mode: EvalMode) -> Self {
+        Self { arg, eval_mode }
+    }
+
+    /// Get the input expression
+    pub fn arg(&self) -> &Arc<dyn PhysicalExpr> {
+        &self.arg
+    }
+}
+
+impl std::fmt::Display for NegativeExpr {
+    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
+        write!(f, "(- {})", self.arg)
+    }
+}
+
+impl PhysicalExpr for NegativeExpr {
+    /// Return a reference to Any that can be used for downcasting
+    fn as_any(&self) -> &dyn Any {
+        self
+    }
+
+    fn data_type(&self, input_schema: &Schema) -> Result<DataType> {
+        self.arg.data_type(input_schema)
+    }
+
+    fn nullable(&self, input_schema: &Schema) -> Result<bool> {
+        self.arg.nullable(input_schema)
+    }
+
+    fn evaluate(&self, batch: &RecordBatch) -> Result<ColumnarValue> {
+        let arg = self.arg.evaluate(batch)?;
+        match arg {
+            ColumnarValue::Array(array) => {
+                if self.eval_mode == EvalMode::Ansi {
+                    match array.data_type() {
+                        DataType::Int8 => check_overflow!(
+                            array,
+                            arrow::array::Int8Array,
+                            i8::MIN,
+                            i8::MAX,
+                            "integer"
+                        ),
+                        DataType::Int16 => check_overflow!(
+                            array,
+                            arrow::array::Int16Array,
+                            i16::MIN,
+                            i16::MAX,
+                            "integer"
+                        ),
+                        DataType::Int32 => check_overflow!(
+                            array,
+                            arrow::array::Int32Array,
+                            i32::MIN,
+                            i32::MAX,
+                            "integer"
+                        ),
+                        DataType::Int64 => check_overflow!(
+                            array,
+                            arrow::array::Int64Array,
+                            i64::MIN,
+                            i64::MAX,
+                            "integer"
+                        ),
+                        DataType::Float32 => check_overflow!(
+                            array,
+                            arrow::array::Float32Array,
+                            f32::MIN,
+                            f32::MAX,
+                            "float"
+                        ),
+                        DataType::Float64 => check_overflow!(
+                            array,
+                            arrow::array::Float64Array,
+                            f64::MIN,
+                            f64::MAX,
+                            "float"
+                        ),
+                        DataType::Interval(value) => match value {
+                            arrow::datatypes::IntervalUnit::YearMonth => check_overflow!(
+                                array,
+                                arrow::array::IntervalYearMonthArray,
+                                i32::MIN,
+                                i32::MAX,
+                                "interval"
+                            ),
+                            arrow::datatypes::IntervalUnit::DayTime => check_overflow!(
+                                array,
+                                arrow::array::IntervalDayTimeArray,
+                                i64::MIN,
+                                i64::MAX,
+                                "interval"
+                            ),
+                            arrow::datatypes::IntervalUnit::MonthDayNano => check_overflow!(
+                                array,
+                                arrow::array::IntervalMonthDayNanoArray,
+                                i128::MIN,
+                                i128::MAX,
+                                "interval"
+                            ),
+                        },
+                        _ => unimplemented!(

Review Comment:
   So if other types come in, like decimal, this will panic?
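   `unimplemented!` aborts the whole process, so a gentler fallback might be to return an error for types the check does not yet cover. A hypothetical sketch (the error type and message are illustrative, not the PR's actual code):

```rust
// Hypothetical fallback for the catch-all arm: report an unsupported type
// (e.g. Decimal128) as a recoverable error instead of panicking.
#[derive(Debug, PartialEq)]
struct UnsupportedTypeError(String);

fn unsupported_type(type_name: &str) -> UnsupportedTypeError {
    UnsupportedTypeError(format!(
        "ANSI negation overflow check not implemented for {type_name}"
    ))
}

fn main() {
    let err = unsupported_type("Decimal128");
    assert!(err.0.contains("Decimal128"));
}
```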



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

