Hyukjin Kwon created SPARK-14787:
------------------------------------

             Summary: Upgrade Joda-Time library from 2.9 to 2.9.3
                 Key: SPARK-14787
                 URL: https://issues.apache.org/jira/browse/SPARK-14787
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: Hyukjin Kwon
            Priority: Trivial


Currently, Spark uses {{joda-time}} library version 2.9.

This should be upgraded to 2.9.3, which fixes several minor bugs, listed
below:

{quote}
Changes in 2.9.3
----------------
 - DateTimeZone data updated to version 2016c

 - Make DateTimeUtils.SYSTEM_MILLIS_PROVIDER public [#357]

 - Fix bug when adding months at the maximum limits of integer [#361]

 - Add Turkish period translations [#364]


Changes in 2.9.2
----------------
 - DateTimeZone data updated to version 2016a (version 2.9 had time-zone data 
2015g)

 - Fix bug in time-zone binary search [#332]

 - Minor fixes to code internals [#339, #326, #344, #350, #343]
 
 - Better document behaviour [#325]


Changes in 2.9.1
----------------
- Fix bug introduced by Long.MIN_VALUE and Long.MAX_VALUE changes [#328]
{quote}

For Spark, the following bugs could cause issues:

1. Fix bug introduced by Long.MIN_VALUE and Long.MAX_VALUE changes [#328]

Constructing a {{DateTime}} at the instant limits, as below, hits this bug:

{code}
DateTime minDT = new DateTime(Long.MIN_VALUE);
DateTime maxDT = new DateTime(Long.MAX_VALUE);
{code}

2. Fix bug in time-zone binary search [#332]

A binary search was introduced for time-zone lookup, but it contains a bug.
If Spark parses certain time zones, this could be an issue.
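The kind of lookup affected can be sketched as below; this is an illustrative example only (the class name, helper method, and zone ID are not from the issue), assuming Joda-Time is on the classpath with its bundled tz database:

```java
import org.joda.time.DateTimeZone;

public class ZoneLookupExample {
    // Resolving a zone ID and querying its offset exercises the
    // time-zone data search whose binary-search bug 2.9.2 fixed (#332).
    static int offsetAtEpoch(String id) {
        DateTimeZone zone = DateTimeZone.forID(id);
        return zone.getOffset(0L); // offset from UTC at the epoch, in millis
    }

    public static void main(String[] args) {
        // Asia/Seoul has been UTC+9 since 1968, so the epoch offset is 9 hours.
        System.out.println(offsetAtEpoch("Asia/Seoul")); // 32400000
    }
}
```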

3. Fix bug when adding months at the maximum limits of integer [#361]

The code below throws an {{ArrayIndexOutOfBoundsException}}:

{code}
DateTime dateTime = new DateTime(1455993437373L);
DateTime date = dateTime.plusMonths(2147483647);
{code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
