[jira] [Updated] (SPARK-35030) ANSI SQL compliance

2021-04-12 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-35030?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-35030:
---
Description: 
Build an ANSI-compliant dialect in Spark, for better data quality and easier 
migration from traditional DBMSs to Spark. For example, Spark will throw an 
exception at runtime instead of returning null results when the inputs to a SQL 
operator/function are invalid. 

The new dialect is controlled by SQL Configuration `spark.sql.ansi.enabled`:
{code:java}
-- `spark.sql.ansi.enabled=true`
SELECT 2147483647 + 1;
java.lang.ArithmeticException: integer overflow

-- `spark.sql.ansi.enabled=false`
SELECT 2147483647 + 1;
+----------------+
|(2147483647 + 1)|
+----------------+
|     -2147483648|
+----------------+
{code}
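
For illustration (not part of the original issue text), the flag can also be 
toggled programmatically in a running session. The following is a minimal Scala 
sketch that assumes an already-built SparkSession named `spark`:
{code:scala}
import scala.util.Try

// Minimal sketch, assuming an existing SparkSession named `spark`.
// Non-ANSI (default) behavior: the addition silently wraps around.
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT 2147483647 + 1").show()  // prints -2147483648

// ANSI behavior: the same query fails at runtime instead of returning a wrong result.
spark.conf.set("spark.sql.ansi.enabled", "true")
println(Try(spark.sql("SELECT 2147483647 + 1").collect()))
// Failure(java.lang.ArithmeticException: integer overflow)
{code}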

Full details of this dialect are documented in 
[https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html|https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html].

Note that some ANSI dialect features may not come directly from the ANSI SQL 
standard, but their behavior aligns with ANSI SQL's style.

  was:
Build an ANSI-compliant dialect in Spark, as opposed to the default 
Hive-compliant dialect. For example, Spark will throw an exception at runtime 
instead of returning null results when the inputs to a SQL operator/function 
are invalid. 

The new dialect is controlled by SQL Configuration `spark.sql.ansi.enabled`:
{code:java}
-- `spark.sql.ansi.enabled=true`
SELECT 2147483647 + 1;
java.lang.ArithmeticException: integer overflow

-- `spark.sql.ansi.enabled=false`
SELECT 2147483647 + 1;
+----------------+
|(2147483647 + 1)|
+----------------+
|     -2147483648|
+----------------+
{code}

Full details of this dialect are documented in 
[https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html|https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html].

Note that some ANSI dialect features may not come directly from the ANSI SQL 
standard, but their behavior aligns with ANSI SQL's style.


> ANSI SQL compliance
> ---
>
> Key: SPARK-35030
> URL: https://issues.apache.org/jira/browse/SPARK-35030
> Project: Spark
>  Issue Type: Epic
>  Components: SQL
>Affects Versions: 3.0.0, 3.2.0, 3.1.1
>Reporter: Gengliang Wang
>Priority: Major
>
> Build an ANSI-compliant dialect in Spark, for better data quality and easier 
> migration from traditional DBMSs to Spark. For example, Spark will throw an 
> exception at runtime instead of returning null results when the inputs to a 
> SQL operator/function are invalid. 
> The new dialect is controlled by SQL Configuration `spark.sql.ansi.enabled`:
> {code:java}
> -- `spark.sql.ansi.enabled=true`
> SELECT 2147483647 + 1;
> java.lang.ArithmeticException: integer overflow
> -- `spark.sql.ansi.enabled=false`
> SELECT 2147483647 + 1;
> +----------------+
> |(2147483647 + 1)|
> +----------------+
> |     -2147483648|
> +----------------+
> {code}
> Full details of this dialect are documented in 
> [https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html|https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html].
> Note that some ANSI dialect features may not come directly from the ANSI SQL 
> standard, but their behavior aligns with ANSI SQL's style.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-35030) ANSI SQL compliance

2021-04-12 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-35030?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-35030:
---
Summary: ANSI SQL compliance  (was: ANSI SQL compliance dialect for better 
data quality and easier migration from traditional DBMS to Spark )

> ANSI SQL compliance
> ---
>
> Key: SPARK-35030
> URL: https://issues.apache.org/jira/browse/SPARK-35030
> Project: Spark
>  Issue Type: Epic
>  Components: SQL
>Affects Versions: 3.0.0, 3.2.0, 3.1.1
>Reporter: Gengliang Wang
>Priority: Major
>
> Build an ANSI-compliant dialect in Spark, as opposed to the default 
> Hive-compliant dialect. For example, Spark will throw an exception at runtime 
> instead of returning null results when the inputs to a SQL operator/function 
> are invalid. 
> The new dialect is controlled by SQL Configuration `spark.sql.ansi.enabled`:
> {code:java}
> -- `spark.sql.ansi.enabled=true`
> SELECT 2147483647 + 1;
> java.lang.ArithmeticException: integer overflow
> -- `spark.sql.ansi.enabled=false`
> SELECT 2147483647 + 1;
> +----------------+
> |(2147483647 + 1)|
> +----------------+
> |     -2147483648|
> +----------------+
> {code}
> Full details of this dialect are documented in 
> [https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html|https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html].
> Note that some ANSI dialect features may not come directly from the ANSI SQL 
> standard, but their behavior aligns with ANSI SQL's style.


