A couple of lines to include a build dependency and import a library, vs.
all of the time to develop and maintain our own time-and-date code, or all
of the user headache of having to work around our choice to link in a
library that doesn't fit their particular needs.

Until there is an obvious, stable and expected-in-almost-all-cases
third-party time-and-date library to choose, I strongly urge that we do not
bind Spark to a particular time-and-date library.  (And there are a lot of
better things that we could be doing with our time than developing yet
another time-and-date implementation of our own.)
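For what it's worth, the kind of timezone conversion Gary asks about below needs no Spark-specific support at all: any JVM date-time library (Joda-Time, NScalaTime, or the JDK's own `java.time` package) can be used directly from a Scala closure. A minimal sketch using `java.time` (the timestamp value is hypothetical, chosen only for illustration):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TzDemo {
    public static void main(String[] args) {
        // A UTC instant, e.g. a log timestamp parsed from input data.
        Instant ts = Instant.parse("2013-09-04T13:45:00Z");

        // Convert the same instant to two local timezones.
        ZonedDateTime newYork = ts.atZone(ZoneId.of("America/New_York"));
        ZonedDateTime tokyo = ts.atZone(ZoneId.of("Asia/Tokyo"));

        System.out.println(newYork); // 09:45 local (EDT, UTC-4)
        System.out.println(tokyo);   // 22:45 local (UTC+9)
    }
}
```

Because this is plain JVM library code, it can be applied inside a `map` over an RDD of timestamps without Spark providing anything out of the box, which is the point of the argument above.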


On Wed, Sep 4, 2013 at 7:45 AM, Gary Malouf <[email protected]> wrote:

> More setup that a user needs to do to reach his functional goals.
>
>
> On Wed, Sep 4, 2013 at 9:40 AM, Mark Hamstra <[email protected]> wrote:
>
>> Why?  What is wrong with using the extant libraries?
>>
>>
>>
>> On Wed, Sep 4, 2013 at 6:37 AM, Gary Malouf <[email protected]> wrote:
>>
>>> Are there any built-in functions for timezone conversions?  I can
>>> obviously bring in NScalaTime and other external libraries. However, being
>>> that this is probably a common need across companies I feel like it would
>>> make more sense to provide this out of the box.
>>>
>>
>>
>
