Re: Spark SQL Thriftserver and Hive UDF in Production

2015-10-19 Thread Deenar Toraskar
Reece

You can do the following: start the spark-shell, register the UDFs in the shell
using sqlContext, and then start the Thrift Server from the spark-shell using
startWithContext:
https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2.scala#L56
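
For example, something along these lines in the spark-shell (just a rough
sketch, assuming a Spark 1.x build with Hive support; the function name and
body are placeholders):

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// In a Hive-enabled spark-shell, sqlContext is a HiveContext at runtime.
val hiveContext = sqlContext.asInstanceOf[HiveContext]

// Register the UDF on the context that the Thrift Server will serve.
hiveContext.udf.register("myfunc", (s: String) => s.toUpperCase)

// Expose that same context over JDBC/ODBC; Tableau then connects via Simba.
HiveThriftServer2.startWithContext(hiveContext)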



Regards
Deenar



Re: Spark SQL Thriftserver and Hive UDF in Production

2015-10-19 Thread Todd Nist
From Tableau, you should be able to use the “Initial SQL” option to support
this.

So in Tableau, add the following to the “Initial SQL”:

CREATE FUNCTION myfunc AS 'myclass'
USING JAR 'hdfs:///path/to/jar';



HTH,
Todd




Spark SQL Thriftserver and Hive UDF in Production

2015-10-18 Thread ReeceRobinson
Does anyone have advice on the best way to deploy a Hive UDF for use with the
Spark SQL Thrift Server, where the client is Tableau using the Simba ODBC Spark
SQL driver?

I have seen the Hive documentation that provides an example of creating the
function using a Hive client, i.e.: CREATE FUNCTION myfunc AS 'myclass' USING
JAR 'hdfs:///path/to/jar';

However, using Tableau I can't run this CREATE FUNCTION statement to register
my UDF. Ideally there would be a configuration setting that loads my UDF jar
and registers it when the Thrift Server starts up.

Can anyone tell me what the best option is, if this is possible at all?






RE: Spark SQL Thriftserver and Hive UDF in Production

2015-10-18 Thread Mohammed Guller
Have you tried registering the function using the Beeline client?

Another alternative would be to create a Spark SQL UDF and launch the Spark SQL 
Thrift server programmatically.
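
A rough sketch of that second option (not a tested implementation; it assumes
Spark 1.x built with Hive support, launched via spark-submit, and the object,
app and function names are placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object UdfThriftServer {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("udf-thriftserver"))
    val hiveContext = new HiveContext(sc)

    // Register the Spark SQL UDF before exposing the context to clients.
    hiveContext.udf.register("myfunc", (s: String) => s.toUpperCase)

    // Start the Thrift Server against this context; everything registered
    // above is then visible to Tableau through the Simba ODBC driver.
    HiveThriftServer2.startWithContext(hiveContext)

    // Keep the driver alive so the server keeps accepting connections.
    Thread.currentThread().join()
  }
}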

Mohammed


