[ https://issues.apache.org/jira/browse/SPARK-12752?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15210431#comment-15210431 ]
Wojciech Indyk edited comment on SPARK-12752 at 3/24/16 3:47 PM:
-----------------------------------------------------------------
For HBase-handler I've created another issue: SPARK-14115

was (Author: woj_in):
For HBase-handler I've created another issue: SPARK-14115

> Can Thrift Server connect to Hive Metastore?
> --------------------------------------------
>
>                 Key: SPARK-12752
>                 URL: https://issues.apache.org/jira/browse/SPARK-12752
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Tao Wang
>         Attachments: JDBCServer.log
>
> Previously we used Thrift Server connected directly to a database such as MySQL to store its metadata. Now we want to read data stored by Hive, so we are trying to connect Thrift Server to the Hive Metastore.
> In non-secure mode this works for me by setting `hive.metastore.uris` to the thrift URL of the Hive Metastore. But when testing it in a secure cluster we hit some Kerberos-related problems.
> The error message is shown in the attached log. The SQL statement is "create table t1 (name string)", which is handled by HiveQl.
> The newly added configuration for security is:
> <property>
>   <name>hive.metastore.uris</name>
>   <value>thrift://9.96.1.116:21088,thrift://9.96.1.115:21088,thrift://9.96.1.114:21088</value>
> </property>
> <property>
>   <name>hive.metastore.sasl.enabled</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.metastore.kerberos.principal</name>
>   <value>hive/hadoop.hadoop....@hadoop.com</value>
> </property>
> I don't know Hive in much depth, but technically I think this mode should work. Advice from anyone with experience would be appreciated. Thanks :)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
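For reference, the three properties quoted in the report form the usual minimal `hive-site.xml` fragment for a Kerberos-secured metastore client; a sketch is below. The hostname, port, and principal here are placeholders, not values from this report (the report's own truncated principal is left untouched above):

```
<!-- Sketch of the secure-metastore client settings, placed in
     hive-site.xml on the Thrift Server's classpath (e.g. conf/).
     metastore-host, 9083, and hive/_HOST@EXAMPLE.COM are
     illustrative placeholders. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host:9083</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
```

With SASL enabled, the process starting the Thrift Server generally also needs a valid Kerberos ticket (e.g. obtained via `kinit` with an appropriate keytab) before `sbin/start-thriftserver.sh` is run, otherwise the metastore handshake fails with GSS/Kerberos errors like those in the attached log.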