Re: Re: Spark 1.6.0 + Hive + HBase

2016-01-28 Thread Ted Yu
ke";<jornfra...@gmail.com>; > *发送时间:* 2016年1月28日(星期四) 晚上9:09 > *收件人:* "开心延年"<muyann...@qq.com>; > *抄送:* "Julio Antonio Soto de Vicente"<ju...@esbet.es>; "Maciej Bryński"< > mac...@brynski.pl>; "Ted Yu"&l

Re: Re: Spark 1.6.0 + Hive + HBase

2016-01-28 Thread 开心延年
to de Vicente"<ju...@esbet.es>; "Maciej Bryński"<mac...@brynski.pl>; "dev"<dev@spark.apache.org>; 主题: Re: 回复: Spark 1.6.0 + Hive + HBase Under sql/hive/src/main/scala/org/apache/spark/sql/hive/execution , I only see HiveTableScan and HiveNativeCom

Re: Re: Spark 1.6.0 + Hive + HBase

2016-01-28 Thread Jörn Franke
Probably a newer Hive version makes a lot of sense here - at least 1.2.1. What storage format are you using? I think the old Hive version had a bug where it always scanned all partitions unless you limited the query in the ON clause to a certain partition (e.g. on date=20201119) > On 28 Jan
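To illustrate the partition-pruning advice above, here is a minimal sketch of restricting a query to a single partition value so the remaining partitions can be pruned instead of scanned. It assumes a Hive table named events partitioned by a string column dt; both names are hypothetical and not taken from this thread.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Minimal sketch (Spark 1.6-era API): query only one partition of a
// partitioned Hive table. Table name `events` and partition column `dt`
// are hypothetical placeholders.
object PartitionPruningExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("partition-pruning"))
    val hiveContext = new HiveContext(sc)

    // Filtering on the partition column with a literal value lets Hive/Spark
    // prune all other partitions rather than scanning the whole table.
    val df = hiveContext.sql(
      "SELECT count(*) FROM events WHERE dt = '2016-01-28'")
    df.show()
  }
}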

Re: Re: Spark 1.6.0 + Hive + HBase

2016-01-28 Thread 开心延年
Vicente"<ju...@esbet.es>; "Maciej Bryński"<mac...@brynski.pl>; "Ted Yu"<yuzhih...@gmail.com>; "dev"<dev@spark.apache.org>; 主题: Re: 回复: Spark 1.6.0 + Hive + HBase Probably a newer Hive version makes a lot of sense here - at least