[ https://issues.apache.org/jira/browse/SPARK-37438?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17447267#comment-17447267 ]
Apache Spark commented on SPARK-37438:
--------------------------------------

User 'gengliangwang' has created a pull request for this issue:
https://github.com/apache/spark/pull/34681

> ANSI mode: Use store assignment rules for resolving function invocation
> -----------------------------------------------------------------------
>
>                 Key: SPARK-37438
>                 URL: https://issues.apache.org/jira/browse/SPARK-37438
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>
> Under ANSI mode (spark.sql.ansi.enabled=true), function invocation in Spark SQL
> resolves input types as follows:
> - In general, it follows the `Store assignment` rules: each input value is cast
>   as if it were being stored as the declared parameter type of the SQL function.
> - Special rules apply to string literals and untyped NULLs: a NULL can be
>   promoted to any other type, while a string literal can be promoted to any
>   simple data type.
> {code:sql}
> > SET spark.sql.ansi.enabled=true;
> -- implicitly cast Int to String type
> > SELECT concat('total number: ', 1);
> total number: 1
> -- implicitly cast Timestamp to Date type
> > SELECT datediff(now(), current_date);
> 0
> -- special rule: implicitly cast String literal to Double type
> > SELECT ceil('0.1');
> 1
> -- special rule: implicitly cast NULL to Date type
> > SELECT year(null);
> NULL
> > CREATE TABLE t(s string);
> -- Can't store a String column as a Numeric type.
> > SELECT ceil(s) FROM t;
> Error in query: cannot resolve 'CEIL(spark_catalog.default.t.s)' due to data type mismatch
> -- Can't store a String column as Date type.
> > SELECT year(s) FROM t;
> Error in query: cannot resolve 'year(spark_catalog.default.t.s)' due to data type mismatch
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
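
[Editor's sketch, not part of the reporter's description or the linked PR:] Since store assignment only governs *implicit* coercion, an explicit CAST on the column should still let the query resolve under ANSI mode. Note that under ANSI mode such a cast fails at runtime if a value is not a valid number, rather than returning NULL.

{code:sql}
-- Assumes the table t(s string) created in the example above.
> SET spark.sql.ansi.enabled=true;
-- Explicit cast: resolves at analysis time; may throw at runtime
-- for rows where s is not parseable as a double.
> SELECT ceil(CAST(s AS DOUBLE)) FROM t;
{code}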