Surbhi-Vijay opened a new issue, #11402:
URL: https://github.com/apache/incubator-gluten/issues/11402

   ### Backend
   
   VL (Velox)
   
   ### Bug description
   
   Casting a floating-point value to a decimal type produces an incorrect result.
   
   `cast('100.12' as decimal(5, 1))`
   Spark answer: `100.1`
   Gluten answer: `10.0`
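
   For reference, Spark's non-ANSI cast to a decimal type rounds the input to the target scale with half-up rounding, which is why `100.12` at `decimal(5, 1)` is expected to be `100.1`. A minimal sketch of that expected rounding using plain `java.math.BigDecimal`, independent of Spark (class name is illustrative):

   ```java
   import java.math.BigDecimal;
   import java.math.RoundingMode;

   public class ExpectedCast {
       public static void main(String[] args) {
           // Spark rounds to the target scale with HALF_UP when casting to decimal,
           // so 100.12 at decimal(5, 1) becomes 100.1
           BigDecimal v = new BigDecimal("100.12").setScale(1, RoundingMode.HALF_UP);
           System.out.println(v); // prints 100.1
       }
   }
   ```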
   
   Same issue is happening with 
   
   Repro Testcase:
    ```scala
     test("Incorrect decimal casting") {
       withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
         withTable("dynparttest2") {
           Seq[(Integer, Integer)](
             (1, 1),
             (1, 3),
             (2, 3),
             (3, 3),
             (4, null),
             (5, null)
           ).toDF("key", "value").createOrReplaceTempView("src")
   
           // decimal
        sql("create table dynparttest2 (value int) partitioned by (pdec decimal(5, 1))")
           sql(
             """
               |insert into table dynparttest2 partition(pdec)
            | select count(*), cast('100.12' as decimal(5, 1)) as pdec from src
             """.stripMargin)
           checkAnswer(
             sql("select * from dynparttest2"),
             Seq(Row(6, new java.math.BigDecimal("100.1"))))
         }
       }
     }
   ```
   
   Test logs
   ```
   Results do not match for query:
    Timezone: sun.util.calendar.ZoneInfo[id="America/Los_Angeles",offset=-28800000,dstSavings=3600000,useDaylight=true,transitions=185,lastRule=java.util.SimpleTimeZone[id=America/Los_Angeles,offset=-28800000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=3,startMonth=2,startDay=8,startDayOfWeek=1,startTime=7200000,startTimeMode=0,endMode=3,endMonth=10,endDay=1,endDayOfWeek=1,endTime=7200000,endTimeMode=0]]
   Timezone Env: 
   
   == Parsed Logical Plan ==
   'Project [*]
   +- 'UnresolvedRelation [dynparttest2], [], false
   
   == Analyzed Logical Plan ==
   value: int, pdec: decimal(5,1)
   Project [value#80, pdec#81]
   +- SubqueryAlias spark_catalog.default.dynparttest2
      +- Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet
   
   == Optimized Logical Plan ==
   Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet
   
   == Physical Plan ==
   VeloxColumnarToRow
    +- ^(1) FileFileSourceScanExecTransformer parquet spark_catalog.default.dynparttest2[value#80,pdec#81] Batched: true, DataFilters: [], Format: Parquet, Location: CatalogFileIndex(1 paths)[...../incubator-gluten/spark-warehouse/org.apa..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<value:int> NativeFilters: []
   
    == Results ==
   !== Correct Answer - 1 ==   == Gluten Answer - 1 ==
    struct<>                   struct<>
   ![6,100.1]                  [6,10.0]
   ```
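
   One pattern worth noting (a hypothesis, not confirmed against the Gluten/Velox code): `10.0` is exactly the value you get if the integral part `100` of the input is reinterpreted as the *unscaled* long of a scale-1 decimal, i.e. the scale is applied one time too many somewhere between the native cast and the Spark row. A sketch of that failure mode (class name is illustrative):

   ```java
   import java.math.BigDecimal;

   public class WrongScaleDemo {
       public static void main(String[] args) {
           // Hypothetical failure mode: the truncated integral value 100 is
           // treated as an unscaled long with scale 1, yielding 10.0 instead of 100.1
           BigDecimal wrong = BigDecimal.valueOf(100L, 1);
           System.out.println(wrong); // prints 10.0
       }
   }
   ```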
   
   ### Gluten version
   
   main branch
   
   ### Spark version
   
   spark-4.0.x
   
   ### Spark configurations
   
   _No response_
   
   ### System information
   
   _No response_
   
   ### Relevant logs
   
   ```bash
   
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

