[ https://issues.apache.org/jira/browse/SPARK-33432?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-33432.
-----------------------------------
    Fix Version/s: 3.1.0
       Resolution: Fixed

Issue resolved by pull request 30357
[https://github.com/apache/spark/pull/30357]

> SQL parser should use active SQLConf
> ------------------------------------
>
>                 Key: SPARK-33432
>                 URL: https://issues.apache.org/jira/browse/SPARK-33432
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1
>            Reporter: Lu Lu
>            Assignee: Lu Lu
>            Priority: Major
>             Fix For: 3.1.0
>
>
> In ANSI mode, schema string parsing should fail if the schema uses an ANSI 
> reserved keyword as an attribute name:
> {code:scala}
> spark.conf.set("spark.sql.ansi.enabled", "true")
> spark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', 
> map('timestampFormat', 'dd/MM/yyyy'));""").show
> output:
> Cannot parse the data type: 
> no viable alternative at input 'time'(line 1, pos 0)
> == SQL ==
> time Timestamp
> ^^^
> {code}
> But this query may accidentally succeed in some cases, because the DataType 
> parser sticks to the configuration of the first session created in the 
> current thread:
> {code:scala}
> // This first call builds the DataType parser with the original session's
> // (non-ANSI) conf, which is then reused by later calls on this thread.
> DataType.fromDDL("time Timestamp")
> val newSpark = spark.newSession()
> newSpark.conf.set("spark.sql.ansi.enabled", "true")
> // Should fail under ANSI mode ('time' is a reserved keyword), but succeeds
> // because the cached parser still sees the old conf:
> newSpark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', 
> map('timestampFormat', 'dd/MM/yyyy'));""").show
> output:
> +--------------------------------+
> |from_json({"time":"26/10/2015"})|
> +--------------------------------+
> |            {2015-10-26 00:00...|
> +--------------------------------+
> {code}
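>
> Per the issue title, the fix makes the parser consult the active SQLConf 
> rather than a conf captured when the parser was first created. Below is a 
> minimal sketch of the two patterns; SQLConfHolder, CapturingParser, and 
> ActiveConfParser are hypothetical names standing in for Spark's thread-local 
> SQLConf.get, not actual Spark APIs:
> {code:scala}
> // Sketch only: SQLConfHolder.active stands in for Spark's thread-local
> // SQLConf.get; none of these names are real Spark classes.
> object SQLConfHolder {
>   val active = new ThreadLocal[Map[String, String]] {
>     override def initialValue(): Map[String, String] = Map.empty
>   }
> }
>
> // Buggy pattern: the conf is captured once, at construction time, so
> // settings changed on a newer session are never seen.
> class CapturingParser(conf: Map[String, String]) {
>   def ansiEnabled: Boolean =
>     conf.getOrElse("spark.sql.ansi.enabled", "false").toBoolean
> }
>
> // Fixed pattern: the conf is read from the active session on every call,
> // so a new session's settings take effect immediately.
> class ActiveConfParser {
>   def ansiEnabled: Boolean =
>     SQLConfHolder.active.get
>       .getOrElse("spark.sql.ansi.enabled", "false").toBoolean
> }
> {code}
> With the second pattern, enabling spark.sql.ansi.enabled on a new session is 
> honored by subsequent parses on the same thread.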


