Reece

You can do the following: start the spark-shell, register the UDFs in the
shell using sqlContext, and then start the Thrift server from the same
shell using HiveThriftServer2.startWithContext:
https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2.scala#L56
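
For example, something along these lines from the spark-shell (a minimal
sketch; the UDF name, logic, and jar path are placeholders, and it assumes
the shell's sqlContext is a HiveContext, which it is when Spark is built
with Hive support):

  // start the shell with your UDF jar on the classpath, e.g.
  // bin/spark-shell --jars /path/to/my-udfs.jar
  import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

  // register a UDF on the shell's sqlContext (placeholder name/logic)
  sqlContext.udf.register("myUpper", (s: String) => s.toUpperCase)

  // hand that same context to the Thrift server, so JDBC/ODBC sessions
  // (Beeline, Tableau) see the function registered above
  HiveThriftServer2.startWithContext(sqlContext)

Tableau then connects to that server as usual and can call myUpper in SQL.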

Regards
Deenar

On 19 October 2015 at 04:42, Mohammed Guller <moham...@glassbeam.com> wrote:

> Have you tried registering the function using the Beeline client?
>
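> For instance, connect with Beeline and run the registration there (a
> sketch; the connection URL, function name, class, and jar path are
> placeholders):
>
>   beeline -u jdbc:hive2://localhost:10000
>   -- then, inside the Beeline session:
>   CREATE FUNCTION myfunc AS 'com.example.MyUDF'
>   USING JAR 'hdfs:///path/to/udfs.jar';
>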
> Another alternative would be to create a Spark SQL UDF and launch the
> Spark SQL Thrift server programmatically.
>
> Mohammed
>
> -----Original Message-----
> From: ReeceRobinson [mailto:re...@therobinsons.gen.nz]
> Sent: Sunday, October 18, 2015 8:05 PM
> To: user@spark.apache.org
> Subject: Spark SQL Thriftserver and Hive UDF in Production
>
> Does anyone have advice on the best way to deploy a Hive UDF for use with
> the Spark SQL Thrift server, where the client is Tableau using the Simba
> Spark SQL ODBC driver?
>
> I have seen the Hive documentation that provides an example of creating
> the function using a Hive client, i.e.: CREATE FUNCTION myfunc AS 'myclass'
> USING JAR 'hdfs:///path/to/jar';
>
> However, using Tableau I can't run this CREATE FUNCTION statement to
> register my UDF. Ideally there would be a configuration setting that loads
> my UDF jar and registers it at start-up of the Thrift server.
>
> Can anyone tell me what the best option is, if this is possible?
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
