[ https://issues.apache.org/jira/browse/SPARK-6385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14378836#comment-14378836 ]
Nick Bruun commented on SPARK-6385:
-----------------------------------

Strictly speaking, the ISO 8601 standard does not define a fixed precision for decimal fractions of seconds (or of minutes or hours, for that matter). Many sources of JSON data emit second fractions with greater than millisecond precision (whether that precision is meaningful is a separate question), so in my opinion Spark should at least support this, along with shorter notations where trailing zeros have been trimmed, if not the entire ISO 8601 date/time standard -- although that *is* probably erring on the side of pedantic. Alternatively, this could be implemented as a standalone library, but that raises the question of library dependencies in Spark.

> ISO 8601 timestamp parsing does not support arbitrary precision second
> fractions
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-6385
>                 URL: https://issues.apache.org/jira/browse/SPARK-6385
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.1
>            Reporter: Nick Bruun
>            Priority: Minor
>
> The ISO 8601 timestamp parsing implemented as a resolution to SPARK-4149 does
> not support arbitrary-precision fractions of seconds, only millisecond
> precision. Parsing {{2015-02-02T00:00:07.900GMT-00:00}} will succeed, while
> {{2015-02-02T00:00:07.9000GMT-00:00}} will fail.
> The issue is caused by the fixed precision of the parsed format in
> [DataTypeConversions.scala#L66|https://github.com/apache/spark/blob/84acd08e0886aa23195f35837c15c09aa7804aff/sql/catalyst/src/main/scala/org/apache/spark/sql/types/DataTypeConversions.scala#L66].
> I'm willing to implement a fix, but pointers on the direction would be
> appreciated.
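For illustration only (not Spark's actual code, and the class name here is hypothetical): one possible direction is `java.time`'s `DateTimeFormatterBuilder.appendFraction`, which, unlike a fixed `SimpleDateFormat` pattern, accepts a variable number of fractional-second digits -- here anywhere from zero to nine -- so both the three- and four-digit inputs from the report parse identically. This assumes Java 8+, which Spark in the 1.2.x era could not necessarily rely on.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;

public class IsoFractionParse {
    // Formatter accepting zero to nine fractional-second digits after an
    // optional decimal point (minWidth = 0, maxWidth = 9).
    static final DateTimeFormatter FMT = new DateTimeFormatterBuilder()
            .appendPattern("yyyy-MM-dd'T'HH:mm:ss")
            .appendFraction(ChronoField.NANO_OF_SECOND, 0, 9, true)
            .toFormatter();

    static int parseNanos(String ts) {
        return LocalDateTime.parse(ts, FMT).getNano();
    }

    public static void main(String[] args) {
        // Three- and four-digit fractions parse with the same formatter,
        // and both denote the same instant within the second.
        System.out.println(parseNanos("2015-02-02T00:00:07.900"));  // 900000000
        System.out.println(parseNanos("2015-02-02T00:00:07.9000")); // 900000000
        System.out.println(parseNanos("2015-02-02T00:00:07"));      // 0
    }
}
```

The timezone suffix in the reported inputs (`GMT-00:00`) would need additional handling in the pattern; this sketch covers only the variable-precision fraction that the issue is about.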