Please help me out!!!! Getting an error while trying to use a Hive Java generic UDF in Spark

2017-01-17 Thread Sirisha Cheruvu
Hi Everyone.. getting the below error while running a Hive Java UDF from the SQL context.. org.apache.spark.sql.AnalysisException: No handler for Hive udf class com.nexr.platform.hive.udf.GenericUDFNVL2 because: com.nexr.platform.hive.udf.GenericUDFNVL2.; line 1 pos 26 at
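A common workaround for this "No handler for Hive udf" error (a sketch, not a verified fix: it assumes the jar containing GenericUDFNVL2 is on the driver and executor classpath, and the function name nvl2 is our choice) is to register the class explicitly through a HiveContext instead of relying on automatic handler resolution:

```scala
// Sketch: register the Hive GenericUDF by class name via a HiveContext
// (requires the nexr UDF jar on the classpath).
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc) // sc: an existing SparkContext
hiveContext.sql(
  "CREATE TEMPORARY FUNCTION nvl2 AS 'com.nexr.platform.hive.udf.GenericUDFNVL2'")
hiveContext.sql("SELECT nvl2(col1, col1, 'was null') FROM myTable")
```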

Re: Calling udf in Spark

2016-09-08 Thread Deepak Sharma
No, it's not required for a UDF. It's required when you convert from an RDD to a DataFrame. Thanks Deepak On 8 Sep 2016 2:25 pm, "Divya Gehlot" <divya.htco...@gmail.com> wrote: > Hi, > > Is it necessary to import sqlContext.implicits._ whenever we define and > call a UDF in Spark? > > > Thanks, > Divya > > >
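Deepak's point can be sketched as follows (Spark 1.x API; the function and column names are illustrative): defining and registering a UDF needs no implicits, while converting an RDD to a DataFrame with toDF does.

```scala
// Sketch: where sqlContext.implicits._ is and is not needed.
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc) // sc: an existing SparkContext

// No import of sqlContext.implicits._ needed for a UDF:
sqlContext.udf.register("plusOne", (x: Int) => x + 1)

// Needed only for RDD-to-DataFrame conversions such as toDF:
import sqlContext.implicits._
val df = sc.parallelize(Seq(1, 2, 3)).toDF("n")
df.selectExpr("plusOne(n)").show()
```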

Calling udf in Spark

2016-09-08 Thread Divya Gehlot
Hi, Is it necessary to import sqlContext.implicits._ whenever we define and call a UDF in Spark? Thanks, Divya

Re: write and call UDF in spark dataframe

2016-07-21 Thread Kabeer Ahmed
Divya: https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html The link gives a complete example of registering a UDAF - a user-defined aggregate function. This is a complete example and it should give you a

Re: write and call UDF in spark dataframe

2016-07-21 Thread Jacek Laskowski
On Thu, Jul 21, 2016 at 5:53 AM, Mich Talebzadeh wrote: > something similar Is this going to be in Scala? > def ChangeToDate (word : String) : Date = { > //return > TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(word,"dd/MM/yyyy"),"yyyy-MM-dd")) > val d1 =

Re: write and call UDF in spark dataframe

2016-07-21 Thread Jacek Laskowski
On Thu, Jul 21, 2016 at 4:53 AM, Divya Gehlot wrote: > To be very specific I am looking for UDFs syntax for example which takes > String as parameter and returns integer .. how do we define the return type val f: String => Int = ??? val myUDF = udf(f) or val myUDF =
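A minimal completion of Jacek's truncated snippet (a sketch; the string-length function is just an illustration of a String => Int UDF, and the column name "word" is assumed):

```scala
// Sketch: a UDF that takes a String and returns an Int.
import org.apache.spark.sql.functions.udf

val f: String => Int = _.length // any String => Int function will do
val myUDF = udf(f)
// or inline:
val myUDF2 = udf((s: String) => s.length)

// Usage in the DataFrame DSL, assuming df has a string column "word":
// df.select(myUDF(df("word")).as("len"))
```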

Re: write and call UDF in spark dataframe

2016-07-21 Thread Jacek Laskowski
On Wed, Jul 20, 2016 at 1:22 PM, Rishabh Bhardwaj wrote: > val new_df = df.select(from_unixtime($"time").as("newtime")) or better yet using tick (less typing and more prose than code :)) df.select(from_unixtime('time) as "newtime") Jacek

Re: write and call UDF in spark dataframe

2016-07-20 Thread Mich Talebzadeh
block box. >> >> Andy >> >> From: Rishabh Bhardwaj <rbnex...@gmail.com> >> Date: Wednesday, July 20, 2016 at 4:22 AM >> To: Rabin Banerjee <dev.rabin.baner...@gmail.com> >> Cc: Divya Gehlot <divya.htco...@gmail.com>, "user @spark&

Re: write and call UDF in spark dataframe

2016-07-20 Thread Divya Gehlot
2 AM > To: Rabin Banerjee <dev.rabin.baner...@gmail.com> > Cc: Divya Gehlot <divya.htco...@gmail.com>, "user @spark" < > user@spark.apache.org> > Subject: Re: write and call UDF in spark dataframe > > Hi Divya, > > There is already "from_unixtime&qu

Re: write and call UDF in spark dataframe

2016-07-20 Thread Andy Davidson
Rabin Banerjee <dev.rabin.baner...@gmail.com> Cc: Divya Gehlot <divya.htco...@gmail.com>, "user @spark" <user@spark.apache.org> Subject: Re: write and call UDF in spark dataframe > Hi Divya, > > There is already "from_unixtime" exists in org.apache.s

Re: write and call UDF in spark dataframe

2016-07-20 Thread Mich Talebzadeh
yep something in line of val df = sqlContext.sql("SELECT from_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') as time ") Note that this does not require a column from an already existing table. HTH Dr Mich Talebzadeh LinkedIn *

Re: write and call UDF in spark dataframe

2016-07-20 Thread Rishabh Bhardwaj
Hi Divya, There is already a "from_unixtime" in org.apache.spark.sql.functions; Rabin has used that in the sql query. If you want to use it in the dataframe DSL you can try like this, val new_df = df.select(from_unixtime($"time").as("newtime")) Thanks, Rishabh. On Wed, Jul 20, 2016 at 4:21

Re: write and call UDF in spark dataframe

2016-07-20 Thread Rabin Banerjee
Hi Divya , Try, val df = sqlContext.sql("select from_unixtime(ts,'yyyy-MM-dd') as `ts` from mr") Regards, Rabin On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot wrote: > Hi, > Could somebody share an example of writing and calling a udf which converts > a unix time stamp to

write and call UDF in spark dataframe

2016-07-20 Thread Divya Gehlot
Hi, Could somebody share an example of writing and calling a udf which converts a unix time stamp to date time. Thanks, Divya
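One way to write the requested UDF by hand (a sketch; the built-in from_unixtime suggested in the replies above is usually preferable, and the function and column names here are assumptions):

```scala
// Sketch: a custom UDF converting a unix timestamp (seconds) to a date string.
import java.text.SimpleDateFormat
import java.util.Date
import org.apache.spark.sql.functions.udf

val tsToDate = udf { (ts: Long) =>
  // A new formatter per call, since SimpleDateFormat is not thread-safe.
  new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date(ts * 1000L))
}
sqlContext.udf.register("ts_to_date", (ts: Long) =>
  new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date(ts * 1000L)))

// DSL:  df.select(tsToDate(df("ts")).as("datetime"))
// SQL:  sqlContext.sql("SELECT ts_to_date(ts) FROM mr")
```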

How to use Spark scala custom UDF in spark sql CLI or beeline client

2016-07-17 Thread pooja mehta
Hi, How to use a Spark Scala custom UDF in the spark-sql CLI or Beeline client? With sqlContext we can register a UDF like this: sqlContext.udf.register("sample_fn", sample_fn _ ) What is the way to use a UDF in the Spark SQL CLI or beeline client? Thanks Pooja
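sqlContext.udf.register only exists inside a Spark application, so it is not available from the CLI or beeline. The usual route (a sketch, assuming the function is packaged as a Hive-style UDF class in a jar; the jar path and class name are hypothetical) is to register it through SQL:

```sql
-- Sketch: in spark-sql or beeline, load the jar and register the UDF by class.
ADD JAR /path/to/my-udfs.jar;
CREATE TEMPORARY FUNCTION sample_fn AS 'com.example.udf.SampleFn';
SELECT sample_fn(col1) FROM myTable;
```

The TEMPORARY function lasts only for the session; registering it permanently requires creating the function in the Hive metastore.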

how to use udf in spark thrift server.

2016-04-08 Thread zhanghn
I want to define some UDFs in my Spark env and serve them in the thrift server, so I can use these UDFs in my beeline connection. At first I tried starting it with udf-jars and creating functions in Hive. In spark-sql, I can add temp functions like "CREATE TEMPORARY FUNCTION bsdUpper AS

Re: How to use registered Hive UDF in Spark DataFrame?

2015-10-04 Thread Umesh Kacha
o we use it? I know >>>> how to >>>> use it in a sql and it works fine >>>> >>>> hiveContext.sql(select MyUDF("test") from myTable); >>>> >>>> My hiveContext.sql() query invo

How to use registered Hive UDF in Spark DataFrame?

2015-10-02 Thread unk1102
groupby("col1","col2","coln").count(); Can we do the following dataframe.select(MyUDF("col1"))??? Please guide. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-registered-Hive-UDF-in-Spark

Re: How to use registered Hive UDF in Spark DataFrame?

2015-10-02 Thread Michael Armbrust
ot;col1","col2","coln").count(); > > Can we do the follwing dataframe.select(MyUDF("col1"))??? Please guide. > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-registered-Hive-UDF-in

Re: How to use registered Hive UDF in Spark DataFrame?

2015-10-02 Thread Umesh Kacha
ose I am trying to convert this query into DataFrame APIs >> >> >> dataframe.select("col1","col2","coln").groupby(""col1","col2","coln").count(); >> >> C

Re: How to use registered Hive UDF in Spark DataFrame?

2015-10-02 Thread Michael Armbrust
l() query involves group by on multiple columns so for >>> scaling purpose I am trying to convert this query into DataFrame APIs >>> >>> >>> dataframe.select("col1","col2","coln").groupby(""col1","col2

How to Hive UDF in Spark DataFrame?

2015-09-13 Thread unk1102
Hi, I am using a UDF in a hiveContext.sql("") query; inside it uses a group by which forces a huge data shuffle read of around 30 GB. I am thinking of converting the above query into a DataFrame so that I avoid using group by. How do we use a Hive UDF in a Spark DataFrame? Please guide. Thanks much.
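A Hive UDF that has been registered (in the metastore or via CREATE TEMPORARY FUNCTION) can also be called from the DataFrame API. A sketch, assuming Spark 1.5+ where callUDF(name, cols...) exists, with a hypothetical class name:

```scala
// Sketch: call a registered Hive UDF from the DataFrame API.
import org.apache.spark.sql.functions.callUDF

hiveContext.sql("CREATE TEMPORARY FUNCTION MyUDF AS 'com.example.MyUDF'")

val df = hiveContext.table("myTable")
df.select(callUDF("MyUDF", df("col1"))).show()
// equivalently, via an expression string:
df.selectExpr("MyUDF(col1)").show()
```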

Re: UDF in spark

2015-07-08 Thread VISHNU SUBRAMANIAN
Hi Vinod, Yes. If you want to use a Scala or Python function you need the block of code. Only Hive UDFs are available permanently. Thanks, Vishnu On Wed, Jul 8, 2015 at 5:17 PM, vinod kumar vinodsachin...@gmail.com wrote: Thanks Vishnu, When restart the service the UDF was not accessible

Re: UDF in spark

2015-07-08 Thread vinod kumar
Thanks Vishnu, When I restart the service the UDF is not accessible by my query. I need to run the mentioned block again to use the UDF. Is there any way to maintain a UDF in sqlContext permanently? Thanks, Vinod On Wed, Jul 8, 2015 at 7:16 AM, VISHNU SUBRAMANIAN johnfedrickena...@gmail.com

Re: UDF in spark

2015-07-08 Thread vinod kumar
Thank you for the quick response Vishnu, I have the following doubts too. 1. Is there any way to upload files to HDFS programmatically using the C# language? 2. Is there any way to automatically load the Scala block of code (for the UDF) when I start the Spark service? -Vinod On Wed, Jul 8, 2015 at 7:57 AM,

Re: UDF in spark

2015-07-08 Thread VISHNU SUBRAMANIAN
Hi, sqlContext.udf.register("udfname", functionname _) example: def square(x:Int):Int = { x * x } register the udf as below: sqlContext.udf.register("square", square _) Thanks, Vishnu On Wed, Jul 8, 2015 at 2:23 PM, vinod kumar vinodsachin...@gmail.com wrote: Hi Everyone, I am new to spark.may I
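Vishnu's example, written out in full (a sketch against the Spark 1.x API):

```scala
// Sketch: define a Scala function, register it as a SQL UDF, and call it.
def square(x: Int): Int = x * x

sqlContext.udf.register("square", square _)

// The registered name is now usable from SQL; square(4) yields 16.
sqlContext.sql("SELECT square(4) AS sq").show()
```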

UDF in spark

2015-07-08 Thread vinod kumar
Hi Everyone, I am new to Spark. May I know how to define and use a User Defined Function in Spark SQL? I want to use the defined UDF in sql queries. My Environment: Windows 8, Spark 1.3.1. Warm Regards, Vinod

Re: Using Hive UDF in spark

2015-07-08 Thread ayan guha
You are most likely confused because you are calling the UDF through HiveContext. In your case, you are using a Spark UDF, not a Hive UDF. For a naive scenario, I can use Spark UDFs without any Hive installation in my cluster. sqlContext.udf.register is for UDFs in Spark. Hive UDFs are stored in Hive
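The distinction Ayan draws can be sketched side by side (the Hive UDF class name here is hypothetical):

```scala
// Sketch: Spark UDF vs Hive UDF registration.

// Spark UDF -- no Hive installation needed; lives only in this context/session:
sqlContext.udf.register("upperSpark", (s: String) => s.toUpperCase)

// Hive UDF -- needs a HiveContext and a Hive UDF class on the classpath:
hiveContext.sql(
  "CREATE TEMPORARY FUNCTION upperHive AS 'com.example.hive.UpperUDF'")
```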

Running Hive UDF from spark-shell fails due to datatype issue

2014-08-05 Thread visakh
issue - http://stackoverflow.com/questions/25059527/udf-not-working-in-spark-sql) -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Hive-UDF-from-spark-shell-fails-due-to-datatype-issue-tp11426.html Sent from the Apache Spark User List mailing list