Hi,
To be specific, I am looking for the UDF syntax, for example a UDF that takes
a String as a parameter and returns an Integer. How do we define the return
type?
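For context, here is a minimal sketch of what I have in mind (the function body, the column name "name", and the table name "people" are just made-up examples). As I understand it, in the Scala API the return type is not declared explicitly; it is inferred from the result type of the Scala function passed to `udf`:

```scala
// Plain Scala function: String => Int.
// The UDF's return type (integer) is inferred from this signature.
val strToInt: String => Int = s => s.length

// Wrap it for the DataFrame DSL (requires Spark on the classpath):
//   import org.apache.spark.sql.functions.udf
//   val strLenUdf = udf(strToInt)
//   df.select(strLenUdf($"name").as("name_len"))

// Or register it for use in SQL queries (Spark 1.x style):
//   sqlContext.udf.register("strLen", strToInt)
//   sqlContext.sql("SELECT strLen(name) AS name_len FROM people")

println(strToInt("spark")) // prints 5
```

Is this the right pattern, or is there a way to state the return type explicitly?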


Thanks,

Divya

On 21 July 2016 at 00:24, Andy Davidson <a...@santacruzintegration.com>
wrote:

> Hi Divya
>
> In general you will get better performance if you can minimize your use of
> UDFs. Spark 2.0 / Tungsten does a lot of code generation, and it has to
> treat your UDF as a black box.
>
> Andy
>
> From: Rishabh Bhardwaj <rbnex...@gmail.com>
> Date: Wednesday, July 20, 2016 at 4:22 AM
> To: Rabin Banerjee <dev.rabin.baner...@gmail.com>
> Cc: Divya Gehlot <divya.htco...@gmail.com>, "user @spark" <
> user@spark.apache.org>
> Subject: Re: write and call UDF in spark dataframe
>
> Hi Divya,
>
> There is already a "from_unixtime" function in org.apache.spark.sql.functions;
> Rabin has used it in the SQL query. If you want to use it in the
> DataFrame DSL you can try it like this:
>
> val new_df = df.select(from_unixtime($"time").as("newtime"))
>
>
> Thanks,
> Rishabh.
>
> On Wed, Jul 20, 2016 at 4:21 PM, Rabin Banerjee <
> dev.rabin.baner...@gmail.com> wrote:
>
>> Hi Divya ,
>>
>> Try,
>>
>> val df = sqlContext.sql("select from_unixtime(ts,'yyyy-MM-dd') as `ts` from
>> mr")
>>
>> (Note: the pattern should use lowercase 'yyyy' for the calendar year;
>> uppercase 'YYYY' is the week-based year and gives wrong results around
>> year boundaries.)
>>
>> Regards,
>> Rabin
>>
>> On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot <divya.htco...@gmail.com>
>> wrote:
>>
>>> Hi,
>>> Could somebody share an example of writing and calling a UDF which
>>> converts a Unix timestamp to a datetime?
>>>
>>>
>>> Thanks,
>>> Divya
>>>
>>
>>
>
