[
https://issues.apache.org/jira/browse/SPARK-42749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17699137#comment-17699137
]
Yuming Wang commented on SPARK-42749:
-------------------------------------
Please enable ANSI mode:
{code:sql}
spark-sql (default)> set spark.sql.ansi.enabled=true;
spark.sql.ansi.enabled true
Time taken: 0.088 seconds, Fetched 1 row(s)
spark-sql (default)> select cast(7.415246799222789E19 as int);
[CAST_OVERFLOW] The value 7.415246799222789E19D of the type "DOUBLE" cannot be
cast to "INT" due to an overflow. Use `try_cast` to tolerate overflow and
return NULL instead. If necessary set "spark.sql.ansi.enabled" to "false" to
bypass this error.
org.apache.spark.SparkArithmeticException: [CAST_OVERFLOW] The value
7.415246799222789E19D of the type "DOUBLE" cannot be cast to "INT" due to an
overflow. Use `try_cast` to tolerate overflow and return NULL instead. If
necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
{code}
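For context, the value 2147483647 reported below is simply INT_MAX: with ANSI mode off, Spark's double-to-int cast clamps out-of-range values to the type bounds rather than raising. A minimal sketch of that saturating behaviour in plain Python (not Spark's actual implementation; the function name is illustrative):

```python
# Integer bounds for a 32-bit signed INT, matching Spark SQL's INT type.
INT_MIN, INT_MAX = -2**31, 2**31 - 1

def saturating_cast_to_int(x: float) -> int:
    """Clamp a double to the INT range, mimicking non-ANSI cast semantics."""
    if x > INT_MAX:
        return INT_MAX
    if x < INT_MIN:
        return INT_MIN
    return int(x)

# 7.415246799222789E19 exceeds INT_MAX, so the cast saturates:
print(saturating_cast_to_int(7.415246799222789e19))  # 2147483647
```

With {{spark.sql.ansi.enabled=true}}, this clamping is replaced by the CAST_OVERFLOW error shown above, and {{try_cast}} returns NULL instead.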
> CAST(x as int) does not generate error with overflow
> ----------------------------------------------------
>
> Key: SPARK-42749
> URL: https://issues.apache.org/jira/browse/SPARK-42749
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.2.1, 3.3.0, 3.3.1, 3.3.2
> Environment: It was tested on a DataBricks environment with DBR 10.4
> and above, running Spark v3.2.1 and above.
> Reporter: Tjomme Vergauwen
> Priority: Major
> Attachments: Spark-42749.PNG
>
>
> Hi,
> When executing the following code:
> {{select cast(7.415246799222789E19 as int)}}
> an error is expected according to the documentation, as
> {{7.415246799222789E19}} is an overflow value for datatype INT.
> However, the value 2147483647 is returned.
> The behaviour of the following is correct as it returns NULL:
> {{select try_cast(7.415246799222789E19 as int)}}
> This results in unexpected behaviour and data corruption.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]