[ https://issues.apache.org/jira/browse/SPARK-6385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14378816#comment-14378816 ]
Michael Armbrust commented on SPARK-6385:
-----------------------------------------

Oh, I see. It looks like this is actually correct, but not what you want: http://stackoverflow.com/questions/12000673/string-date-conversion-with-nanoseconds. Is the format you are describing part of the standard? I'm not opposed to us doing something custom (assuming it's well tested) if we have to, but I'd like to avoid adding too many non-standard semantics.

> ISO 8601 timestamp parsing does not support arbitrary precision second
> fractions
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-6385
>                 URL: https://issues.apache.org/jira/browse/SPARK-6385
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.1
>            Reporter: Nick Bruun
>            Priority: Minor
>
> The ISO 8601 timestamp parsing implemented as a resolution to SPARK-4149
> does not support arbitrary-precision fractions of seconds, only millisecond
> precision. Parsing {{2015-02-02T00:00:07.900GMT-00:00}} will succeed, while
> {{2015-02-02T00:00:07.9000GMT-00:00}} will fail.
> The issue is caused by the fixed precision of the parsed format in
> [DataTypeConversions.scala#L66|https://github.com/apache/spark/blob/84acd08e0886aa23195f35837c15c09aa7804aff/sql/catalyst/src/main/scala/org/apache/spark/sql/types/DataTypeConversions.scala#L66].
> I'm willing to implement a fix, but pointers on the direction would be
> appreciated.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
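The "correct, but not what you want" behavior Michael references can be sketched with a minimal standalone Java example (hypothetical illustration, not Spark's actual code path; the `GMT-00:00` zone suffix is omitted for brevity). A lenient `SimpleDateFormat` with an `SSS` field reads the whole digit run after the dot as a raw millisecond count, so `.9000` parses as 9000 ms and silently rolls the timestamp forward by 9 seconds instead of meaning nine-tenths of a second:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class FractionDemo {
    public static void main(String[] args) throws Exception {
        // SSS is a plain number field, not a decimal fraction of a second.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS");
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));

        // ".9000" is consumed as 9000 milliseconds; the default lenient
        // calendar rolls that into +9 seconds rather than failing.
        Date d = fmt.parse("2015-02-02T00:00:07.9000");
        System.out.println(fmt.format(d)); // 2015-02-02T00:00:16.000, not ...07.900
    }
}
```

One direction for a fix, as an assumption rather than the route Spark ultimately took: normalize the fractional part to exactly three digits (truncating or right-padding with zeros) before handing the string to the millisecond-precision parser.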