Hi Spark users and developers,
what do you do if you want the from_unixtime function in Spark SQL to
return timestamps in a timezone of your choice instead of the system timezone?
Best Regards,
Jerry
Look at
to_utc_timestamp
from_utc_timestamp
On Jan 18, 2016 9:39 AM, "Jerry Lam" wrote:
> Hi Spark users and developers,
>
> what do you do if you want the from_unixtime function in Spark SQL to
> return timestamps in a timezone of your choice instead of the system timezone?
>
> Best
Thanks Alex:
So you suggested something like:
from_utc_timestamp(to_utc_timestamp(from_unixtime(1389802875),'America/Montreal'),
'America/Los_Angeles')?
This is a lot of conversion :)
Is there a particular reason for from_unixtime not to take timezone
information?
I think I will make a UDF.
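A minimal sketch of what such a UDF could do, in plain Python with the standard-library `zoneinfo` module; the function name `from_unixtime_tz` is hypothetical, and the commented-out Spark registration line is only an assumption about how it might be wired in:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def from_unixtime_tz(epoch, tz):
    """Render a unix timestamp (seconds) in the given IANA timezone."""
    utc = datetime.fromtimestamp(epoch, tz=timezone.utc)
    return utc.astimezone(ZoneInfo(tz)).strftime("%Y-%m-%d %H:%M:%S")

# The value from this thread, 1389802875, is 2014-01-15 16:21:15 UTC.
print(from_unixtime_tz(1389802875, "America/Montreal"))     # 2014-01-15 11:21:15
print(from_unixtime_tz(1389802875, "America/Los_Angeles"))  # 2014-01-15 08:21:15

# Hypothetically, in a Spark session this could be registered as a UDF:
# spark.udf.register("from_unixtime_tz", from_unixtime_tz)
```

Because the conversion goes through an explicit UTC instant, the result no longer depends on the system timezone of the driver or executors.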
If you can find a function in Oracle, MySQL, or Postgres that works
better, then we can create a similar one.
Timezone conversion is tricky because of daylight saving time,
so it is better to use UTC, without DST, in the database/DW.
On Jan 18, 2016 1:24 PM, "Jerry Lam" wrote:
>