Spark 1.6 works fine with Hive 2. I did not know there was a restriction
there.

I assume you are talking about Spark using the Hive metastore?
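For reference, here is a minimal sketch of how one can point Spark SQL at a particular metastore version when launching spark-shell. The property names are the documented spark.sql.hive.metastore.* settings; the version value below is only an illustration and has to match what your Spark build actually supports, and "maven" can be replaced with a classpath to your own Hive jars:

$ spark-shell \
    --conf spark.sql.hive.metastore.version=1.2.1 \
    --conf spark.sql.hive.metastore.jars=maven

scala> sqlContext.sql("show databases").collect.foreach(println)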


Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_25)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
SQL context available as sqlContext.
scala> sql("show databases").collect.foreach(println)
[accounts]
[asehadoop]
[default]
[iqhadoop]
[mytable_db]
[oraclehadoop]
[test]


Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 1 April 2016 at 04:47, guoqing0...@yahoo.com.hk.INVALID <
guoqing0...@yahoo.com.hk.invalid> wrote:

> Hi, I'd like to know whether Spark 1.6 only supports Hive 0.13, or whether
> it can be built with higher versions like 1.x?
>
> ------------------------------
> guoqing0...@yahoo.com.hk
>
