how to convert millisecond time to SQL timeStamp

2016-02-01 Thread Andy Davidson
What little I know about working with timestamps is based on https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html. Using its example of dates formatted as human-friendly strings -> timestamps, I was able to figure out how
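The string-to-timestamp step the blog post describes can be sketched outside Spark with plain JDK classes. This is only a minimal illustration of the same idea, not the DataFrame API the post covers; the example date value is made up:

```java
import java.sql.Timestamp;

public class StringToTimestamp {
    public static void main(String[] args) {
        // Timestamp.valueOf parses the JDBC escape format "yyyy-[m]m-[d]d hh:mm:ss[.f...]"
        Timestamp ts = Timestamp.valueOf("2016-02-01 00:00:00");
        System.out.println(ts);  // 2016-02-01 00:00:00.0
    }
}
```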

Re: how to convert millisecond time to SQL timeStamp

2016-02-01 Thread Ted Yu
See related thread on using Joda DateTime: http://search-hadoop.com/m/q3RTtSfi342nveex1=RE+NPE+when+using+Joda+DateTime On Mon, Feb 1, 2016 at 7:44 PM, Kevin Mellott wrote: > I've had pretty good success using Joda-Time >

Re: how to convert millisecond time to SQL timeStamp

2016-02-01 Thread VISHNU SUBRAMANIAN
Hi, if you need a DataFrame-specific solution, you can try the below: df.select(from_unixtime(col("max(utcTimestamp)")/1000)) On Tue, 2 Feb 2016 at 09:44 Ted Yu wrote: > See related thread on using Joda DateTime: > http://search-hadoop.com/m/q3RTtSfi342nveex1=RE+NPE+ >

Re: how to convert millisecond time to SQL timeStamp

2016-02-01 Thread Kevin Mellott
I've had pretty good success using Joda-Time for date/time manipulations within Spark applications. You may be able to use the *DateTime* constructor below if you are starting with milliseconds. DateTime public DateTime(long instant) Constructs an
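Joda-Time's DateTime(long instant) constructor takes epoch milliseconds and builds a DateTime in the default time zone. Since Joda-Time needs an extra jar, the sketch below shows the equivalent call with the Java 8+ java.time classes instead; the millisecond value is an example, and the Joda form is only shown in a comment:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class FromMillis {
    public static void main(String[] args) {
        long instantMillis = 1454284800000L;  // example epoch-millisecond value

        // Joda-Time equivalent: new DateTime(instantMillis)
        // java.time version, pinned to UTC for a deterministic result:
        ZonedDateTime dt = Instant.ofEpochMilli(instantMillis).atZone(ZoneOffset.UTC);
        System.out.println(dt);  // 2016-02-01T00:00Z
    }
}
```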