[ https://issues.apache.org/jira/browse/SPARK-38534?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17505219#comment-17505219 ]
Apache Spark commented on SPARK-38534:
--------------------------------------

User 'dongjoon-hyun' has created a pull request for this issue:
https://github.com/apache/spark/pull/35825

> Flaky Test: ansi/datetime-parsing-invalid.sql
> ---------------------------------------------
>
>                 Key: SPARK-38534
>                 URL: https://issues.apache.org/jira/browse/SPARK-38534
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 3.3.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>
> **Java 8**
> {code}
> $ bin/spark-shell --conf spark.sql.ansi.enabled=true
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> 22/03/12 00:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context Web UI available at http://172.16.0.31:4040
> Spark context available as 'sc' (master = local[*], app id = local-1647075572229).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
>       /_/
> Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_322)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> sql("select to_timestamp('366', 'DD')").show
> java.time.format.DateTimeParseException: Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
> {code}
> **Java 11+**
> {code}
> $ bin/spark-shell --conf spark.sql.ansi.enabled=true
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> 22/03/12 01:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context Web UI available at http://172.16.0.31:4040
> Spark context available as 'sc' (master = local[*], app id = local-1647075607932).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
>       /_/
> Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> sql("select to_timestamp('366', 'DD')").show
> java.time.DateTimeException: Invalid date 'DayOfYear 366' as '1970' is not a leap year. If necessary set spark.sql.ansi.enabled to false to bypass this error.
> {code}
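The flakiness comes from the JDK-dependent error text above: the same query raises java.time.format.DateTimeParseException on Java 8 but java.time.DateTimeException on Java 11+, so a golden file for ansi/datetime-parsing-invalid.sql can only match one of the two messages. Below is a minimal, hypothetical Scala sketch (not the change from the linked pull request) that reproduces the query with ANSI mode enabled and accepts either message; the object name, app name, and the message substrings it checks are illustrative assumptions.

{code}
// Hypothetical reproducer for SPARK-38534; not the fix from the linked PR.
import scala.util.{Failure, Success, Try}

import org.apache.spark.sql.SparkSession

object DatetimeParsingRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-38534-repro")
      .config("spark.sql.ansi.enabled", "true")
      .getOrCreate()

    // Walk the cause chain: the java.time exception may be wrapped by Spark's
    // execution layer depending on where the failure surfaces.
    def allMessages(t: Throwable): String =
      if (t == null) "" else String.valueOf(t.getMessage) + " | " + allMessages(t.getCause)

    // collect() forces evaluation; with ANSI mode on, the parse failure is raised
    // instead of the expression returning NULL.
    Try(spark.sql("select to_timestamp('366', 'DD')").collect()) match {
      case Failure(e) =>
        val msg = allMessages(e)
        // Java 8:   DateTimeParseException "... unparsed text found at index 2 ..."
        // Java 11+: DateTimeException "... '1970' is not a leap year ..."
        val recognized =
          msg.contains("unparsed text found") || msg.contains("not a leap year")
        println(s"Query failed as expected; JDK-dependent message recognized: $recognized")
      case Success(rows) =>
        println(s"Unexpected success: ${rows.mkString(", ")}")
    }

    spark.stop()
  }
}
{code}

Run on Java 8 this should take the DateTimeParseException branch and on Java 11+ the DateTimeException branch, but the substring check passes either way, which is the kind of JDK-agnostic assertion a deflaked test would need.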