[jira] [Commented] (SPARK-42546) SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()

2023-03-19 Thread Serge Rielau (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17702337#comment-17702337
 ] 

Serge Rielau commented on SPARK-42546:
--

Can't speak for Wenchen, but +1 [~ddavies1] 

> SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()
> --
>
> Key: SPARK-42546
> URL: https://issues.apache.org/jira/browse/SPARK-42546
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.3.2, 3.4.0
>Reporter: Serge Rielau
>Priority: Major
>
> Under ANSI mode, SPARK-42045 added error conditions instead of silent
> overflows for edge cases in round() and bround().
> However, it appears this fix works only for the INT data type. Trying it on a
> smaller integral type, e.g. TINYINT, the function still returns wrong results:
> {code:java}
> spark-sql> select round(2147483647, -1);
> [ARITHMETIC_OVERFLOW] Overflow. If necessary set "spark.sql.ansi.enabled" to 
> "false" to bypass this error.{code}
> {code:java}
> spark-sql> select round(127y, -1);
> -126 {code}
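For whoever picks this up, here is a minimal set of additional repro statements covering the other integral types. This is a sketch based on the description above: the literal values are simply the per-type maxima, and the expectation that ANSI mode should raise ARITHMETIC_OVERFLOW for each of them (rather than wrap) is extrapolated from the INT case, not verified output.

{code:java}
-- Run in spark-sql with spark.sql.ansi.enabled=true.

-- TINYINT: round(127, -1) = 130, which does not fit in a byte (max 127)
select round(127y, -1);

-- SMALLINT: round(32767, -1) = 32770, which does not fit in a short (max 32767)
select round(32767s, -1);

-- BIGINT: round(9223372036854775807, -1) = 9223372036854775810, which overflows a long
select round(9223372036854775807L, -1);

-- bround() presumably needs the same handling for the non-INT types
select bround(127y, -1);
{code}

If the behaviour matches the TINYINT example in the description, each of these currently wraps silently instead of raising ARITHMETIC_OVERFLOW.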






[jira] [Commented] (SPARK-42546) SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()

2023-03-17 Thread Daniel Davies (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17701987#comment-17701987
 ] 

Daniel Davies commented on SPARK-42546:
---

Can I take this?

> SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()
> --
>
> Key: SPARK-42546
> URL: https://issues.apache.org/jira/browse/SPARK-42546
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.3.2, 3.4.0
>Reporter: Serge Rielau
>Priority: Major
>
> Under ANSI mode, SPARK-42045 added error conditions instead of silent
> overflows for edge cases in round() and bround().
> However, it appears this fix works only for the INT data type. Trying it on a
> smaller integral type, e.g. TINYINT, the function still returns wrong results:
> {code:java}
> spark-sql> select round(2147483647, -1);
> [ARITHMETIC_OVERFLOW] Overflow. If necessary set "spark.sql.ansi.enabled" to 
> "false" to bypass this error.{code}
> {code:java}
> spark-sql> select round(127y, -1);
> -126 {code}






[jira] [Commented] (SPARK-42546) SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()

2023-02-23 Thread Wenchen Fan (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17692985#comment-17692985
 ] 

Wenchen Fan commented on SPARK-42546:
-

[~beliefer] do you have time to take a look?

> SPARK-42045 is incomplete in supporting ANSI_MODE for round() and bround()
> --
>
> Key: SPARK-42546
> URL: https://issues.apache.org/jira/browse/SPARK-42546
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.3.2, 3.4.0
>Reporter: Serge Rielau
>Priority: Major
>
> Under ANSI mode, SPARK-42045 added error conditions instead of silent
> overflows for edge cases in round() and bround().
> However, it appears this fix works only for the INT data type. Trying it on a
> smaller integral type, e.g. TINYINT, the function still returns wrong results:
> {code:java}
> spark-sql> select round(2147483647, -1);
> [ARITHMETIC_OVERFLOW] Overflow. If necessary set "spark.sql.ansi.enabled" to 
> "false" to bypass this error.{code}
> {code:java}
> spark-sql> select round(127y, -1);
> -126 {code}


