Divya:
https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html
The link gives a complete example of registering a UDAF (user-defined
aggregate function). It should give you a good starting point.
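For reference, the UDAF API that blog post covers looks roughly like the sketch below. This is a minimal illustration assuming Spark 1.5/1.6 (the sqlContext era used elsewhere in this thread); SumOfLengths is a made-up example, not taken from the post:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

// Hypothetical UDAF: sums the lengths of the strings in a column.
class SumOfLengths extends UserDefinedAggregateFunction {
  // One string input column
  def inputSchema: StructType = StructType(StructField("word", StringType) :: Nil)
  // One long accumulator in the aggregation buffer
  def bufferSchema: StructType = StructType(StructField("total", LongType) :: Nil)
  def dataType: DataType = LongType
  def deterministic: Boolean = true
  def initialize(buffer: MutableAggregationBuffer): Unit = buffer(0) = 0L
  def update(buffer: MutableAggregationBuffer, input: Row): Unit =
    if (!input.isNullAt(0)) buffer(0) = buffer.getLong(0) + input.getString(0).length
  def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit =
    buffer1(0) = buffer1.getLong(0) + buffer2.getLong(0)
  def evaluate(buffer: Row): Any = buffer.getLong(0)
}

// Register so it can be used from SQL:
// sqlContext.udf.register("sum_lengths", new SumOfLengths)
```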
On Thu, Jul 21, 2016 at 5:53 AM, Mich Talebzadeh wrote:
> something similar
Is this going to be in Scala?
> def ChangeToDate (word : String) : Date = {
>   // return TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(word, "dd/MM/yyyy"), "yyyy-MM-dd"))
>   val d1 =
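The ChangeToDate snippet above is truncated and mixes Hive SQL functions into Scala code. One way to complete the idea in plain Scala — a sketch, assuming input strings are formatted as dd/MM/yyyy — is:

```scala
import java.sql.Date
import java.text.SimpleDateFormat

// Parse a "dd/MM/yyyy" string into java.sql.Date.
// SimpleDateFormat is not thread-safe, so create one per call.
def changeToDate(word: String): Date = {
  val parsed = new SimpleDateFormat("dd/MM/yyyy").parse(word)
  new Date(parsed.getTime)
}

// Wrap as a UDF and apply to a (hypothetical) string column "word":
// import org.apache.spark.sql.functions.udf
// val toDate = udf(changeToDate _)
// df.select(toDate($"word").as("d"))
```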
On Thu, Jul 21, 2016 at 4:53 AM, Divya Gehlot wrote:
> To be very specific, I am looking for UDF syntax, for example one which takes a
> String as a parameter and returns an Integer... how do we define the return type?
val f: String => Int = ???
val myUDF = udf(f)
or
val myUDF =
On Wed, Jul 20, 2016 at 1:22 PM, Rishabh Bhardwaj wrote:
> val new_df = df.select(from_unixtime($"time").as("newtime"))
or better yet using tick (less typing and more prose than code :))
df.select(from_unixtime('time) as "newtime")
Jacek
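Putting Jacek's snippets together, a self-contained sketch — assuming a Spark 1.6-style sqlContext and a DataFrame df with a string column "word", both names chosen for illustration:

```scala
import org.apache.spark.sql.functions.udf

// A concrete String => Int function (here: string length, nulls as 0).
val f: String => Int = s => if (s == null) 0 else s.length

// The return type (IntegerType) is inferred from the Scala function type,
// so there is no need to declare it separately.
val myUDF = udf(f)

// DataFrame DSL usage:
// df.select(myUDF($"word").as("len"))

// Or register it for use in SQL queries:
// sqlContext.udf.register("strlen", f)
// sqlContext.sql("SELECT strlen(word) FROM words")
```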
>> black box.
>>
>> Andy
>>
>> From: Rishabh Bhardwaj <rbnex...@gmail.com>
>> Date: Wednesday, July 20, 2016 at 4:22 AM
>> To: Rabin Banerjee <dev.rabin.baner...@gmail.com>
>> Cc: Divya Gehlot <divya.htco...@gmail.com>, "user @spark" <user@spark.apache.org>
>> Subject: Re: write and call UDF in spark dataframe
yep something in line of
val df = sqlContext.sql("SELECT from_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') as time")
Note that this does not require a column from an already existing table.
HTH
Dr Mich Talebzadeh
Hi Divya,
There is already a "from_unixtime" function in org.apache.spark.sql.functions;
Rabin has used that in the SQL query. If you want to use it in the DataFrame DSL,
you can try it like this:
val new_df = df.select(from_unixtime($"time").as("newtime"))
Thanks,
Rishabh.
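For completeness, Rishabh's DSL suggestion with its imports and an optional custom format — a sketch assuming a DataFrame df with a Long column "time" holding epoch seconds:

```scala
import org.apache.spark.sql.functions.from_unixtime

// Default output format is "yyyy-MM-dd HH:mm:ss":
val new_df = df.select(from_unixtime($"time").as("newtime"))

// Or pass an explicit pattern, as in the SQL examples in this thread:
val fmt_df = df.select(from_unixtime($"time", "dd/MM/yyyy").as("newtime"))
```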
On Wed, Jul 20, 2016 at 4:21, Rabin Banerjee wrote:
Hi Divya ,
Try,
val df = sqlContext.sql("select from_unixtime(ts,'yyyy-MM-dd') as `ts` from mr")
Regards,
Rabin
On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot wrote:
> Hi,
> Could somebody share an example of writing and calling a UDF which converts a
> unix timestamp to
Hi,
Could somebody share an example of writing and calling a UDF which converts a
unix timestamp to a date time?
Thanks,
Divya