If you can find a function in Oracle, MySQL, or PostgreSQL which works
better, then we can create a similar one.

Timezone conversion is tricky because of daylight saving time,
so it is better to use UTC without DST in the database/DW.
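
For example, a rough sketch of the UDF idea Jerry mentions below could look
like this (the name from_unixtime_tz and the spark-shell sqlContext are my
assumptions, not an existing Spark function):

  // Hypothetical UDF: format a unix epoch (in seconds) in an explicit
  // timezone instead of the system default. Sketch only, not built in.
  import java.text.SimpleDateFormat
  import java.util.{Date, TimeZone}

  sqlContext.udf.register("from_unixtime_tz", (epochSeconds: Long, tz: String) => {
    val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
    fmt.setTimeZone(TimeZone.getTimeZone(tz))   // e.g. "America/Los_Angeles"
    fmt.format(new Date(epochSeconds * 1000L))  // epoch seconds -> millis
  })

  // Usage from Spark SQL, e.g.:
  //   SELECT from_unixtime_tz(1389802875, 'America/Los_Angeles')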
On Jan 18, 2016 1:24 PM, "Jerry Lam" <chiling...@gmail.com> wrote:

> Thanks Alex:
>
> So you suggested something like:
> from_utc_timestamp(to_utc_timestamp(from_unixtime(1389802875),'America/Montreal'),
> 'America/Los_Angeles')?
>
> This is a lot of conversion :)
>
> Is there a particular reason not to have from_unixtime to take timezone
> information?
>
> I think I will make a UDF if this is the only way out of the box.
>
> Thanks!
>
> Jerry
>
> On Mon, Jan 18, 2016 at 2:32 PM, Alexander Pivovarov <apivova...@gmail.com
> > wrote:
>
>> Look at
>> to_utc_timestamp
>>
>> from_utc_timestamp
>> On Jan 18, 2016 9:39 AM, "Jerry Lam" <chiling...@gmail.com> wrote:
>>
>>> Hi spark users and developers,
>>>
>>> what do you do if you want the from_unixtime function in spark sql to
>>> return the timezone you want instead of the system timezone?
>>>
>>> Best Regards,
>>>
>>> Jerry
>>>
>>
>
