[ https://issues.apache.org/jira/browse/SPARK-36076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17379954#comment-17379954 ]
Apache Spark commented on SPARK-36076:
--------------------------------------

User 'dgd-contributor' has created a pull request for this issue:
https://github.com/apache/spark/pull/33325

> [SQL] ArrayIndexOutOfBounds in CAST string to timestamp
> -------------------------------------------------------
>
>                 Key: SPARK-36076
>                 URL: https://issues.apache.org/jira/browse/SPARK-36076
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.1
>            Reporter: Andy Grove
>            Assignee: dgd_contributor
>            Priority: Major
>
> I discovered this bug during some fuzz testing.
> {code:java}
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.1.1
>       /_/
>
> Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_282)
> Type in expressions to have them evaluated.
> Type :help for more information.
>
> scala> import org.apache.spark.sql.types.DataTypes
>
> scala> val df = Seq(":8:434421+ 98:38").toDF("c0")
> df: org.apache.spark.sql.DataFrame = [c0: string]
>
> scala> val df2 = df.withColumn("c1", col("c0").cast(DataTypes.TimestampType))
> df2: org.apache.spark.sql.DataFrame = [c0: string, c1: timestamp]
>
> scala> df2.show
> java.lang.ArrayIndexOutOfBoundsException: 9
>   at org.apache.spark.sql.catalyst.util.DateTimeUtils$.stringToTimestamp(DateTimeUtils.scala:328)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.$anonfun$castToTimestamp$2(Cast.scala:455)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.buildCast(Cast.scala:295)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.$anonfun$castToTimestamp$1(Cast.scala:451)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.nullSafeEval(Cast.scala:840)
>   at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:476)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
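The stack trace points at `DateTimeUtils.stringToTimestamp`, which parses the input into a fixed-size array of numeric segments; the fuzzed string drives the segment index past the end of that array. The sketch below is a minimal, hypothetical illustration of that class of bug, not Spark's actual parser: `parseSegments` and its 9-slot segment array are assumptions for illustration. Without the bounds check, an input with too many separators writes past the array and throws `ArrayIndexOutOfBoundsException`; with it, the parser returns `None`, matching the expected cast-to-null behavior for malformed strings.

```scala
object SegmentParse {
  // Hypothetical helper (not Spark's implementation): split a
  // timestamp-like string into at most 9 numeric segments, the same
  // array shape stringToTimestamp uses internally. Returning None on
  // overflow is the fix pattern; indexing without the check would
  // throw ArrayIndexOutOfBoundsException on malformed input.
  def parseSegments(s: String): Option[Array[Int]] = {
    val segments = new Array[Int](9)
    var i = 0        // index of the segment being filled
    var current = 0  // digits accumulated for the current segment
    for (c <- s.trim) {
      if (c.isDigit) {
        current = current * 10 + (c - '0')
      } else {
        // Any non-digit closes the current segment. This bounds check
        // is what prevents the out-of-bounds write.
        if (i >= segments.length) return None
        segments(i) = current
        current = 0
        i += 1
      }
    }
    if (i >= segments.length) return None
    segments(i) = current
    Some(segments)
  }
}
```

A well-formed timestamp such as `"2021-07-11 10:30:00"` fills the first six segments, while an input with more separators than segments is rejected with `None` instead of crashing. The actual crash involves Spark's timezone-offset handling around the `"+ 98:38"` suffix, which this sketch does not model; the point is the bounds-check pattern the fix applies.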