[ https://issues.apache.org/jira/browse/SPARK-11760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jean-Baptiste Onofré resolved SPARK-11760.
------------------------------------------
    Resolution: Invalid

It has already been fixed by:
{code}
commit 06f1fdba6d1425afddfc1d45a20dbe9bede15e7a
Author: Wenchen Fan <wenc...@databricks.com>
Date:   Mon Nov 16 08:58:40 2015 -0800

    [SPARK-11752] [SQL] fix timezone problem for DateTimeUtils.getSeconds

    code snippet to reproduce it:
    ```
    TimeZone.setDefault(TimeZone.getTimeZone("Asia/Shanghai"))
    val t = Timestamp.valueOf("1900-06-11 12:14:50.789")
    val us = fromJavaTimestamp(t)
    assert(getSeconds(us) === t.getSeconds)
    ```

    It would be good to add a regression test for it, but the reproducing code
    needs to change the default timezone, and even if we change it back, the
    `lazy val defaultTimeZone` in `DateTimeUtils` is already fixed.

    Author: Wenchen Fan <wenc...@databricks.com>

    Closes #9728 from cloud-fan/seconds.
{code}

> SQL Catalyst date time test fails
> ---------------------------------
>
>                 Key: SPARK-11760
>                 URL: https://issues.apache.org/jira/browse/SPARK-11760
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Jean-Baptiste Onofré
>
> In the sql/catalyst module, test("hours / minute / seconds") fails on the
> third test datum:
> {code}
> - hours / minute / seconds *** FAILED ***
>   29 did not equal 50 (DateTimeUtilsSuite.scala:370)
> {code}
> The problem is that the seconds computation does not use the timezone, so
> the two timestamps may compare differently.
> I will submit a PR to fix that in DateTimeUtils.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
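An aside on the root cause, as a minimal standalone sketch (not Spark's actual `DateTimeUtils` code): for dates before 1901, the IANA tz database gives Asia/Shanghai a local-mean-time offset of UTC+8:05:43, which is not a whole number of minutes. The seconds field of a wall-clock time therefore depends on the timezone used for the conversion, which is why a `getSeconds` implementation that ignores the timezone can disagree with `java.sql.Timestamp.getSeconds`. The instant chosen below is illustrative, not taken from the test suite:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TimezoneSecondsDemo {
    public static void main(String[] args) {
        // An arbitrary instant in 1900, before Asia/Shanghai switched
        // from LMT (UTC+8:05:43) to the whole-hour UTC+8 offset in 1901.
        Instant instant = Instant.parse("1900-06-11T04:14:50Z");

        // Same instant, rendered in two zones: because the 1900 Shanghai
        // offset includes 43 extra seconds, the seconds fields disagree.
        int utcSeconds = ZonedDateTime
                .ofInstant(instant, ZoneId.of("UTC")).getSecond();
        int shanghaiSeconds = ZonedDateTime
                .ofInstant(instant, ZoneId.of("Asia/Shanghai")).getSecond();

        System.out.println("UTC seconds:      " + utcSeconds);
        System.out.println("Shanghai seconds: " + shanghaiSeconds);
    }
}
```

This is the class of discrepancy the commit above fixes: extracting the seconds field must apply the timezone's full offset, not just its whole-minute part.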