[ https://issues.apache.org/jira/browse/SPARK-33677?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Lu Lu updated SPARK-33677:
--------------------------
    Description: 
Spark SQL should throw an exception when the LIKE pattern string is invalid, for example when the ESCAPE character is itself a wildcard:
{code:sql}
SELECT a LIKE 'm%aca' ESCAPE '%' from t;
{code}

  was:
In ANSI mode, schema string parsing should fail if the schema uses an ANSI reserved keyword as an attribute name:
{code:scala}
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));""").show

output:
Cannot parse the data type:
no viable alternative at input 'time'(line 1, pos 0)

== SQL ==
time Timestamp
^^^
{code}
However, this query may accidentally succeed in some cases because the DataType parser sticks to the configs of the first session created in the current thread:
{code:scala}
DataType.fromDDL("time Timestamp")
val newSpark = spark.newSession()
newSpark.conf.set("spark.sql.ansi.enabled", "true")
newSpark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));""").show

output:
+--------------------------------+
|from_json({"time":"26/10/2015"})|
+--------------------------------+
|            {2015-10-26 00:00...|
+--------------------------------+
{code}


> LikeSimplification should be skipped if escape is a wildcard character
> ----------------------------------------------------------------------
>
>                 Key: SPARK-33677
>                 URL: https://issues.apache.org/jira/browse/SPARK-33677
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Lu Lu
>            Assignee: Lu Lu
>            Priority: Major
>
> Spark SQL should throw an exception when the LIKE pattern string is invalid, for example when the ESCAPE character is itself a wildcard:
> {code:sql}
> SELECT a LIKE 'm%aca' ESCAPE '%' from t;
> {code}
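For context, the LikeSimplification optimizer rule rewrites simple LIKE patterns (such as a trailing '%') into cheaper string operations like prefix matching. Below is a minimal sketch, using a hypothetical helper that is not Spark's actual rule, of the guard this issue calls for: skip the simplification when the ESCAPE character is itself a SQL wildcard, so the invalid pattern surfaces an error instead of being silently rewritten.
{code:scala}
// Hypothetical sketch, not Spark's LikeSimplification implementation.
object LikeSimplificationSketch {
  // Returns Some(prefix) only when the pattern is a plain "starts with" pattern
  // and the ESCAPE character is not a SQL wildcard.
  def simplifiableStartsWith(pattern: String, escapeChar: Char): Option[String] = {
    if (escapeChar == '%' || escapeChar == '_') {
      // The escape character is a wildcard: the pattern is ambiguous/invalid,
      // so do not simplify; let regular LIKE evaluation report the error.
      None
    } else if (pattern.endsWith("%") &&
        !pattern.dropRight(1).exists(c => c == '%' || c == '_' || c == escapeChar)) {
      Some(pattern.dropRight(1))
    } else {
      None
    }
  }

  // e.g. simplifiableStartsWith("mike%", '\\') == Some("mike")
  //      simplifiableStartsWith("m%aca", '%')  == None  (escape is a wildcard; skip)
}
{code}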