I don't know what's wrong, but I can suggest looking up the source of the UDF and debugging from there. I would think this is some JDK API caveat and not a Spark bug.
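To narrow it down, it can help to check what the JDK itself reports for Europe/Bratislava at the timestamps in question, independent of Spark. A minimal sketch (the class name TzCheck and the chosen test instants are mine, just for illustration): if java.util.TimeZone already returns the DST-aware offset correctly, the problem is more likely on the Spark side.

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class TzCheck {
    public static void main(String[] args) {
        TimeZone tz = TimeZone.getTimeZone("Europe/Bratislava");
        TimeZone utc = TimeZone.getTimeZone("UTC");

        // A UTC instant in February 2016, before DST starts (should be UTC+1)
        Calendar feb = new GregorianCalendar(utc);
        feb.set(2016, Calendar.FEBRUARY, 22, 0, 59, 11);

        // A UTC instant in April 2016, after DST starts on 27 March (should be UTC+2)
        Calendar apr = new GregorianCalendar(utc);
        apr.set(2016, Calendar.APRIL, 1, 12, 0, 0);

        // getOffset returns the total offset from UTC in milliseconds,
        // including the DST adjustment in effect at that instant
        System.out.println("Feb offset (h): " + tz.getOffset(feb.getTimeInMillis()) / 3600000);
        System.out.println("Apr offset (h): " + tz.getOffset(apr.getTimeInMillis()) / 3600000);
    }
}
```

If this prints 1 for February and 2 for April, the JDK's timezone data is fine and the +2h result for February timestamps would point elsewhere (e.g. how the timestamp column is parsed or which JVM default timezone the executors run with).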
--
Jan Sterba
https://twitter.com/honzasterba | http://flickr.com/honzasterba | http://500px.com/honzasterba

On Fri, Mar 4, 2016 at 6:08 PM, Michal Vince <vince.mic...@gmail.com> wrote:
> Hi guys,
>
> I'm using Spark 1.6.0 and I'm not sure if I found a bug or I'm doing
> something wrong.
>
> I'm playing with dataframes and I'm converting ISO 8601 timestamps with
> millis to my timezone - which is Europe/Bratislava - with the
> from_utc_timestamp function from spark.sql.functions.
>
> The problem is that Europe/Bratislava is UTC+1h in February, and from
> 27th March it's going to be UTC+2h, but from_utc_timestamp ignores this
> and always converts to UTC+2h.
>
> Am I doing something wrong when converting?
>
> e.g.
> myDF.withColumn("time",
>   from_utc_timestamp(l.col("timestamp"), "Europe/Bratislava"))
>
> and the output:
>
> time                     timestamp
> 2016-02-22 02:59:11.0    2016-02-22T00:59:11.000Z
> 2016-02-20 20:16:35.0    2016-02-20T18:16:35.000Z
> 2016-02-20 05:17:29.0    2016-02-20T03:17:29.000Z
> 2016-02-18 18:29:06.0    2016-02-18T16:29:06.000Z
> 2016-02-17 01:47:20.0    2016-02-16T23:47:20.000Z
> 2016-02-15 07:05:04.0    2016-02-15T05:05:04.000Z
> 2016-02-13 23:25:14.0    2016-02-13T21:25:14.000Z