[ https://issues.apache.org/jira/browse/SPARK-39060?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17532936#comment-17532936 ]

Dongjoon Hyun commented on SPARK-39060:
---------------------------------------

I updated the fix version from 3.3.0 to 3.3.1 because this change is not in RC1.

> Typo in error messages of decimal overflow
> ------------------------------------------
>
>                 Key: SPARK-39060
>                 URL: https://issues.apache.org/jira/browse/SPARK-39060
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.1
>            Reporter: Vitalii Li
>            Assignee: Vitalii Li
>            Priority: Major
>             Fix For: 3.1.3, 3.0.4, 3.4.0, 3.3.1
>
>
>    org.apache.spark.SparkArithmeticException 
>    Decimal(expanded,10000000000000000000000000000000000000.1,39,1}) cannot be 
> represented as Decimal(38, 1). If necessary set spark.sql.ansi.enabled to 
> false to bypass this error.
>  
> As shown in {{decimalArithmeticOperations.sql.out}}, notice the extra {{}}} before 'cannot'.
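For reference, a minimal reproduction sketch (assumptions: a local SparkSession on Spark 3.2.x with ANSI mode enabled, and an illustrative object name; the query mirrors the style of the decimal golden-file tests, and the exact error class can differ across versions). Setting spark.sql.ansi.enabled to false makes the same query return NULL instead of throwing.

    import org.apache.spark.sql.SparkSession

    object DecimalOverflowRepro {
      def main(args: Array[String]): Unit = {
        // Local session with ANSI mode on, so decimal overflow throws instead of returning NULL.
        val spark = SparkSession.builder()
          .master("local[*]")
          .config("spark.sql.ansi.enabled", "true")
          .getOrCreate()

        // (5e36BD + 0.1) + 5e36BD produces 10000000000000000000000000000000000000.1,
        // which needs Decimal(39, 1) and cannot be represented as Decimal(38, 1),
        // so Spark raises SparkArithmeticException with the message quoted above.
        spark.sql("SELECT (5e36BD + 0.1) + 5e36BD").show()

        spark.stop()
      }
    }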



