comphead commented on issue #2793:
URL: 
https://github.com/apache/datafusion-comet/issues/2793#issuecomment-3561016437

   @andygrove I tested with `MakeDecimal` directly, which is not part of the public SQL API:
   
   ```
    test("make decimal") {
      sql("create table t1 using parquet as select 123456 as c1 from range(1)")

      withSQLConf(
        CometConf.COMET_EXEC_ENABLED.key -> "true",
        SQLConf.USE_V1_SOURCE_LIST.key -> "parquet",
        CometConf.COMET_ENABLED.key -> "true",
        CometConf.COMET_EXPLAIN_FALLBACK_ENABLED.key -> "true",
        "spark.sql.ansi.enabled" -> "true",
        "spark.sql.adaptive.enabled" -> "false",
        "spark.comet.expression.Sum.allowIncompatible" -> "true",
        CometConf.COMET_NATIVE_SCAN_IMPL.key -> "native_iceberg_compat",
        "spark.sql.optimizer.excludedRules" -> "org.apache.spark.sql.catalyst.optimizer.ConstantFolding") {

        val df = sql("select * from t1")
        val makeDecimalExpr = MakeDecimal(df.col("c1").expr, 3, 0)
        val makeDecimalColumn = new Column(makeDecimalExpr)
        val df1 = df.withColumn("result", makeDecimalColumn)
        df1.explain("formatted")

        checkSparkAnswerAndOperator(df1)
      }
    }
   ```
   
   I found that in ANSI mode Spark adds a specific overflow check on the JVM side, so the check is delegated entirely to the JVM.
   There is no correctness issue. The interesting part is that when an overflow happens in ANSI mode, the native code is not invoked at all: the error comes from the JVM without ever calling native `make_decimal`.
   
   @coderfender this may also be true for ANSI math expressions.
   
   Without ANSI mode, the native code is called for both overflow and non-overflow values.
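   The ANSI vs non-ANSI semantics above can be sketched as a minimal model (this is not the actual Comet implementation; `make_decimal_non_ansi` is a hypothetical simplification of Spark's `MakeDecimal` with `nullOnOverflow = true`):

   ```rust
   // Hedged sketch of MakeDecimal overflow semantics: MakeDecimal reinterprets
   // an unscaled i64 as a decimal(precision, scale). In non-ANSI mode
   // (nullOnOverflow = true) overflow yields null; in ANSI mode the JVM-side
   // check raises the error before native code runs at all.
   fn make_decimal_non_ansi(unscaled: i64, precision: u32) -> Option<i64> {
       // Largest unscaled magnitude that fits in `precision` digits, e.g. 999 for 3.
       let max_unscaled = 10_i64.pow(precision) - 1;
       if unscaled.abs() <= max_unscaled {
           Some(unscaled) // value fits the target precision
       } else {
           None // overflow maps to null when nullOnOverflow is true
       }
   }

   fn main() {
       // c1 = 123456 from the test above does not fit decimal(3, 0)...
       assert_eq!(make_decimal_non_ansi(123456, 3), None);
       // ...while 999 does.
       assert_eq!(make_decimal_non_ansi(999, 3), Some(999));
   }
   ```

   This matches the test case: `MakeDecimal(df.col("c1").expr, 3, 0)` over `c1 = 123456` is exactly the overflowing input.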
   
   I also found a bug related to the DataFrame API:
   
   ```
    Caused by: org.apache.comet.CometNativeException: primitive array
            at std::backtrace::Backtrace::create(__internal__:0)
            at comet::errors::init::{{closure}}(__internal__:0)
            at std::panicking::panic_with_hook(__internal__:0)
            at std::panicking::panic_handler::{{closure}}(__internal__:0)
            at std::sys::backtrace::__rust_end_short_backtrace(__internal__:0)
            at __rustc::rust_begin_unwind(__internal__:0)
            at core::panicking::panic_fmt(__internal__:0)
            at core::option::expect_failed(__internal__:0)
            at datafusion_comet_spark_expr::math_funcs::internal::make_decimal::spark_make_decimal(__internal__:0)
            at <datafusion_comet_spark_expr::comet_scalar_funcs::CometScalarFunction as datafusion_expr::udf::ScalarUDFImpl>::invoke_with_args(__internal__:0)
            at <datafusion_physical_expr::scalar_function::ScalarFunctionExpr as datafusion_physical_expr_common::physical_expr::PhysicalExpr>::evaluate(__internal__:0)
            at <core::iter::adapters::GenericShunt<I,R> as core::iter::traits::iterator::Iterator>::next(__internal__:0)
            at <datafusion_physical_plan::projection::ProjectionStream as futures_core::stream::Stream>::poll_next(__internal__:0)
            at comet::execution::jni_api::Java_org_apache_comet_Native_executePlan::{{closure}}::{{closure}}(__internal__:0)
            at _Java_org_apache_comet_Native_executePlan(__internal__:0)
         at org.apache.comet.Native.executePlan(Native Method)
   ```
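   The panic message `primitive array` together with the `core::option::expect_failed` frame suggests the native code unwraps a failed downcast with `expect` instead of returning an error. A minimal stdlib-only sketch of that failure pattern (hypothetical; `as_primitive_values` is an invented stand-in, and the real `spark_make_decimal` operates on Arrow arrays):

   ```rust
   // Hedged sketch of the failure pattern implied by the backtrace: an Option
   // produced by a type downcast is unwrapped with expect("primitive array"),
   // so an unexpected input type panics instead of surfacing a proper error.
   fn as_primitive_values(maybe_primitive: Option<&[i64]>) -> &[i64] {
       // Assumes the input is always the expected primitive array type.
       maybe_primitive.expect("primitive array")
   }

   fn main() {
       // The expected input type works fine...
       assert_eq!(as_primitive_values(Some(&[123456]))[0], 123456);

       // ...but an input that fails the downcast (modeled as None here) panics
       // with the same "primitive array" message seen in the exception above.
       let result = std::panic::catch_unwind(|| as_primitive_values(None));
       assert!(result.is_err());
   }
   ```

   If that reading is right, the fix would be to return a `DataFusionError` for unexpected input types rather than panicking across the JNI boundary.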
    

