[ https://issues.apache.org/jira/browse/SPARK-42399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17688598#comment-17688598 ]
Narek Karapetian commented on SPARK-42399:
------------------------------------------

Why do we need to throw an exception in ANSI mode? Is this behavior described somewhere in the SQL standard? What do you think about treating such a case as a valid scenario that returns the correct result? For example, this query:
{code:java}
spark-sql> SELECT CONV(SUBSTRING('0xffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff', 3), 16, 10);
{code}
would evaluate to:
{code:java}
115792089237316195423570985008687907853269984665640564039457584007913129639935
{code}
This could be implemented by using BigInt instead of `NumberConverter.convert(...)`, which uses Long as its data type.

> CONV() silently overflows returning wrong results
> -------------------------------------------------
>
> Key: SPARK-42399
> URL: https://issues.apache.org/jira/browse/SPARK-42399
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Serge Rielau
> Priority: Critical
>
> spark-sql> SELECT CONV(SUBSTRING('0xffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff', 3), 16, 10);
> 18446744073709551615
> Time taken: 2.114 seconds, Fetched 1 row(s)
> spark-sql> set spark.sql.ansi.enabled = true;
> spark.sql.ansi.enabled true
> Time taken: 0.068 seconds, Fetched 1 row(s)
> spark-sql> SELECT CONV(SUBSTRING('0xffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff', 3), 16, 10);
> 18446744073709551615
> Time taken: 0.05 seconds, Fetched 1 row(s)
>
> In ANSI mode we should raise an error for sure.
> In non-ANSI mode either an error or a NULL may be acceptable.
> Alternatively, of course, we could consider whether we can support arbitrary
> domains, since the result is a STRING again.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
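The BigInt-based approach suggested in the comment could be sketched as follows. This is a minimal standalone illustration, not Spark's actual `NumberConverter` implementation; the `BigIntConv` object and `conv` method names are hypothetical:

```scala
// Hypothetical sketch of arbitrary-precision base conversion.
// The Long-based path wraps around at 2^64 - 1 (18446744073709551615),
// which is the silent overflow reported in SPARK-42399; BigInt has no
// such limit because it is arbitrary precision.
object BigIntConv {
  def conv(num: String, fromBase: Int, toBase: Int): String = {
    // BigInt(String, radix) parses a numeric string of arbitrary length
    val value = BigInt(num, fromBase)
    // render in the target base; CONV conventionally upper-cases hex digits
    value.toString(toBase).toUpperCase
  }
}
```

With this sketch, `BigIntConv.conv("f" * 64, 16, 10)` produces the full 2^256 - 1 value quoted above rather than the truncated 18446744073709551615.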