Hello guys,

I'm using Spark SQL with Hive through the Thrift server.
I need this because I need to build a table from all tables matching a name mask.
Here is an example:
1. Get the list of tables matching the mask, e.g. SHOW TABLES IN db 'table__*'
2. Build a query like the following (a rough sketch of both steps is below):
CREATE TABLE total_data AS
SELECT * FROM table__1
UNION ALL
SELECT * FROM table__2
UNION ALL
SELECT * FROM table__3
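
Here is a rough Scala sketch of those two steps, assuming a SparkSession called spark with Hive support enabled; the names db, table__* and total_data are just the ones from the example above:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("union-by-mask")
  .enableHiveSupport()
  .getOrCreate()

// Step 1: list the tables matching the mask.
val tables = spark.sql("SHOW TABLES IN db LIKE 'table__*'")
  .select("tableName")
  .collect()
  .map(_.getString(0))

// Step 2: build and run the CREATE TABLE ... AS SELECT ... UNION ALL ... query.
val union = tables.map(t => s"SELECT * FROM db.$t").mkString("\nUNION ALL\n")
spark.sql(s"CREATE TABLE db.total_data AS\n$union")

The table list, and therefore the UNION ALL text, is only known at run time, which is why the SQL has to be generated dynamically.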

Because of this, I need to create a JDBC connection inside a UDF. The problem is that
the connection has to be created dynamically, which means I need to take the
host name, port and user name from Hive properties. That is easy from Hive;
I'm using these properties:
host - hive.server2.thrift.bind.host
port - hive.server2.thrift.port
user - always the same as the user that ran the UDF
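
To make it concrete, this is roughly what the UDF side looks like today (Scala; the class name, the connection string and the way the user is looked up are only illustrative). It reads the host and port from the session HiveConf and opens a JDBC connection to HiveServer2 with them:

import java.sql.{Connection, DriverManager}
import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.hive.ql.session.SessionState

class UnionByMask extends UDF {
  def evaluate(mask: String): String = {
    // Works when called from Hive, where the session properties are populated.
    val conf = SessionState.get().getConf
    val host = conf.get("hive.server2.thrift.bind.host")
    val port = conf.get("hive.server2.thrift.port", "10000")
    val user = System.getProperty("user.name") // the user the UDF runs as

    Class.forName("org.apache.hive.jdbc.HiveDriver")
    var conn: Connection = null
    try {
      conn = DriverManager.getConnection(s"jdbc:hive2://$host:$port/default", user, "")
      // ... run SHOW TABLES by mask and the CREATE TABLE ... UNION ALL query here ...
      s"connected to $host:$port as $user"
    } finally {
      if (conn != null) conn.close()
    }
  }
}

On YARN this breaks, because the bind host comes back empty and the user is wrong, which is exactly the problem below.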

But the problem is that the hive.server2.thrift.bind.host parameter is not defined on
YARN, and the user that runs the UDF is hive.
Maybe you have a solution for how I can get the host name, and, more importantly,
how I can run the UDF as the user that ran the SQL (not the hive user).
