[ https://issues.apache.org/jira/browse/SPARK-51941?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-51941:
-----------------------------------
    Labels: pull-request-available  (was: )

> CatalystTypeConverters.convertToCatalyst failed to convert BigDecimal between 
> -1.0 and 1.0
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-51941
>                 URL: https://issues.apache.org/jira/browse/SPARK-51941
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 4.0.0, 3.5.5, 4.1.0
>            Reporter: Junqing Li
>            Priority: Major
>              Labels: pull-request-available
>
> SPARK-20211 fixed a BigDecimal type conversion exception, but the 
> CatalystTypeConverters.convertToCatalyst method was not updated accordingly, 
> so users still hit the exception when converting BigDecimal values strictly 
> between -1.0 and 1.0. Below is a minimal reproduction:
> {code:java}
> CatalystTypeConverters.convertToCatalyst(BigDecimal("0.01")) {code}
> {code:java}
> Decimal scale (2) cannot be greater than precision (1).
> org.apache.spark.sql.AnalysisException: Decimal scale (2) cannot be greater 
> than precision (1).
>     at 
> org.apache.spark.sql.errors.DataTypeErrors$.decimalCannotGreaterThanPrecisionError(DataTypeErrors.scala:122)
>     at org.apache.spark.sql.types.DecimalType.<init>(DecimalType.scala:46)
>     at 
> org.apache.spark.sql.catalyst.CatalystTypeConverters$.convertToCatalyst(CatalystTypeConverters.scala:578)
>     at 
> org.apache.spark.sql.catalyst.CatalystTypeConvertersSuite.$anonfun$new$18(CatalystTypeConvertersSuite.scala:159)
>  {code}
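The failure comes from how java.math.BigDecimal reports precision and scale: for values strictly between -1.0 and 1.0, precision() counts only the significant digits of the unscaled value, while scale() counts all fractional digits, so scale can exceed precision, and a DecimalType built directly from that pair is invalid. A minimal sketch of the underlying behavior using only the JDK (not Spark code; the max-based guard mirrors the approach taken in SPARK-20211 for other conversion paths, shown here as an assumption about the intended fix):

```java
import java.math.BigDecimal;

public class DecimalPrecisionDemo {
    public static void main(String[] args) {
        BigDecimal d = new BigDecimal("0.01");
        // Unscaled value is 1, so precision() is 1; scale() is 2.
        // This is the (precision=1, scale=2) pair that DecimalType rejects.
        System.out.println(d.precision()); // 1
        System.out.println(d.scale());     // 2

        // Guarding precision with max(precision, scale) yields a valid pair
        // (hypothetical fix sketch, modeled on the SPARK-20211 change).
        int precision = Math.max(d.precision(), d.scale());
        System.out.println(precision + "," + d.scale()); // 2,2
    }
}
```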



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
