[ https://issues.apache.org/jira/browse/SPARK-6385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14378801#comment-14378801 ]

Nick Bruun commented on SPARK-6385:
-----------------------------------

An extra {{S}} does not seem to do the trick, as the resulting date ({{res2}}) 
is incorrect ({{:16}} rather than {{:07}}). I've looked through a number of 
libraries, and all seem to handle fractions the same way ({{SSS}} and nothing 
more), so I'm considering writing a proper parser instead. What is the 
project's position on having that level of complexity in Spark?
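
For reference, a minimal demonstration of why the extra {{S}} misbehaves (the pattern below is the one from DataTypeConversions.scala with one extra {{S}} added): {{SimpleDateFormat}} reads the {{S}} field as a literal millisecond count rather than a decimal fraction, and the lenient calendar rolls the overflow into the seconds field.

{code:scala}
import java.text.SimpleDateFormat
import java.util.TimeZone

// The SSS pattern from DataTypeConversions.scala, with one extra S.
val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSSz")
format.setTimeZone(TimeZone.getTimeZone("GMT"))

// ".9000" is read as 9000 milliseconds, not as a decimal fraction, so
// the lenient calendar rolls 9 extra seconds into the seconds field:
// 00:00:07 becomes 00:00:16.
println(format.parse("2015-02-02T00:00:07.9000GMT-00:00"))
{code}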

> ISO 8601 timestamp parsing does not support arbitrary precision second 
> fractions
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-6385
>                 URL: https://issues.apache.org/jira/browse/SPARK-6385
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.1
>            Reporter: Nick Bruun
>            Priority: Minor
>
> The ISO 8601 timestamp parsing implemented as a resolution to SPARK-4149 does 
> not support arbitrary-precision fractions of seconds, only millisecond 
> precision. Parsing {{2015-02-02T00:00:07.900GMT-00:00}} will succeed, while 
> {{2015-02-02T00:00:07.9000GMT-00:00}} will fail.
> The issue is caused by the fixed precision of the format pattern in 
> [DataTypeConversions.scala#L66|https://github.com/apache/spark/blob/84acd08e0886aa23195f35837c15c09aa7804aff/sql/catalyst/src/main/scala/org/apache/spark/sql/types/DataTypeConversions.scala#L66].
> I'm willing to implement a fix, but pointers on the right direction would be 
> appreciated.
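
A minimal sketch of one possible direction (hypothetical, not the implemented fix): normalize the second fraction to exactly three digits before handing the string to the existing {{SSS}}-based pattern. Note that this truncates anything beyond millisecond precision, which would be lossy if the value is later stored in a nanosecond-resolution {{java.sql.Timestamp}}.

{code:scala}
import java.text.SimpleDateFormat

// Hypothetical helper: pad or truncate the second fraction to exactly
// three digits so the fixed SSS pattern can parse it.
def normalizeFraction(timestamp: String): String = {
  val Fraction = """(.*\.)(\d+)(\D.*)""".r
  timestamp match {
    case Fraction(prefix, digits, suffix) =>
      prefix + digits.padTo(3, '0').take(3) + suffix
    case _ => timestamp // no fraction found; leave the string untouched
  }
}

val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSz")
// Both now parse to the same instant, 2015-02-02 00:00:07.900 GMT.
println(format.parse(normalizeFraction("2015-02-02T00:00:07.900GMT-00:00")))
println(format.parse(normalizeFraction("2015-02-02T00:00:07.9000GMT-00:00")))
{code}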



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
