Hi,

I am trying to point the Hive metastore to a MySQL database from a
HiveContext. This works fine in Spark 1.3.1, but it seems broken in Spark
1.4.1-rc1, where it always connects to the default (local) metastore. Is
this a regression, or must the connection now be set in hive-site.xml?

The code is very simple in spark shell:
    import org.apache.spark.sql.hive._
    val hiveContext = new HiveContext(sc)
    hiveContext.setConf("javax.jdo.option.ConnectionDriverName", "com.mysql.jdbc.Driver")
    hiveContext.setConf("javax.jdo.option.ConnectionUserName", "hive")
    hiveContext.setConf("javax.jdo.option.ConnectionPassword", "hive")
    hiveContext.setConf("javax.jdo.option.ConnectionURL", "jdbc:mysql://10.111.3.186:3306/hive")
    hiveContext.setConf("hive.metastore.warehouse.dir", "/user/hive/warehouse")
    hiveContext.sql("select * from mysqltable").show()
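In case it helps to compare behavior, the same settings can also be placed
in a hive-site.xml on the driver classpath (e.g. under conf/). A minimal
sketch, reusing the same MySQL host and credentials as above (whether this
file is now required in 1.4.1 is exactly my question):

    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://10.111.3.186:3306/hive</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
      </property>
    </configuration>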

Thanks!
-Terry
