Convert the column to a column of java.sql.Timestamp values. Then you can do
the following:

import java.sql.Timestamp
import java.util.Calendar

// Extracts the requested field from a Timestamp. Despite the name, this
// returns a field value (an Int) rather than truncating the date.
def date_trunc(timestamp: Timestamp, timeField: String) = {
  val cal = Calendar.getInstance()
  cal.setTimeInMillis(timestamp.getTime())
  timeField match {
    case "hour" => cal.get(Calendar.HOUR_OF_DAY)
    // java.util.Calendar has no DAY field; use DAY_OF_MONTH here,
    // or DAY_OF_YEAR for the day-of-year question below.
    case "day" => cal.get(Calendar.DAY_OF_MONTH)
    case "dayofyear" => cal.get(Calendar.DAY_OF_YEAR)
  }
}

sqlContext.udf.register("date_trunc", date_trunc _)
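
Once registered, the UDF can be called from SQL. A minimal usage sketch,
assuming a registered temp table "events" with a Timestamp column "ts"
(both names are just placeholders):

// "events" and "ts" are illustrative names, not from the thread.
sqlContext.sql("SELECT date_trunc(ts, 'dayofyear') AS day_of_year FROM events").show()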

On Wed, Jul 8, 2015 at 9:23 PM, Harish Butani <rhbutani.sp...@gmail.com>
wrote:

> Try the spark-datetime package:
> https://github.com/SparklineData/spark-datetime
> Follow this example
> https://github.com/SparklineData/spark-datetime#a-basic-example to get
> the different attributes of a DateTime.
>
> On Wed, Jul 8, 2015 at 9:11 PM, prosp4300 <prosp4...@163.com> wrote:
>
>> As mentioned in the Spark SQL programming guide, Spark SQL supports Hive
>> UDFs. Please take a look at Hive's built-in date UDFs below; getting the
>> day of year should be as simple as in an existing RDBMS.
>>
>> https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
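>>
>> For example, a sketch of day of year built only from Hive date functions
>> (datediff, to_date, year, concat, cast), assuming sqlContext is a
>> HiveContext and that "events"/"ts" are placeholder table and column names:
>>
>> // datediff from Jan 1 of the same year, plus 1, gives the day of year.
>> val q = "SELECT datediff(to_date(ts), " +
>>   "concat(cast(year(to_date(ts)) AS STRING), '-01-01')) + 1 " +
>>   "AS day_of_year FROM events"
>> sqlContext.sql(q).show()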
>>
>>
>> At 2015-07-09 12:02:44, "Ravisankar Mani" <rrav...@gmail.com> wrote:
>>
>> Hi everyone,
>>
>> I can't get 'day of year' using a Spark query. Can you suggest any way
>> to get the day of year?
>>
>> Regards,
>> Ravi