EXPLAIN over JDBC

2010-11-23 Thread Peter Hall
Hi, when running a query like EXPLAIN SELECT * FROM mytable over the JDBC driver, calling ResultSet.next() returns false at the end of the abstract syntax tree, even though there are more rows to fetch. So typical JDBC processing code of the form while (rs.next()) { // process a row } stops b...
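
For reference, a minimal sketch of the kind of JDBC loop described above; the driver class and connection URL are assumptions for a local Hive server and are not taken from the original message:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExplainOverJdbc {
        public static void main(String[] args) throws Exception {
            // Driver class and URL are assumptions for a locally running Hive server.
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();

            // EXPLAIN output comes back as rows of plan text; the reported problem
            // is that rs.next() returns false after the abstract syntax tree section,
            // before the remaining plan rows have been fetched.
            ResultSet rs = stmt.executeQuery("EXPLAIN SELECT * FROM mytable");
            while (rs.next()) {
                System.out.println(rs.getString(1)); // process a row of plan output
            }

            rs.close();
            stmt.close();
            con.close();
        }
    }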

change Pre/Post Query Hooks to take only 1 parameter: HookContext

2010-11-23 Thread Liyin Tang
Hi all, before HIVE-1785 was committed, the old hook interface took 4 or 5 parameters in its run method, which does not scale when adding more complicated hooks. Hooks may need more information than those 4 or 5 parameters carry, and Hive should make it possible to add more hooks without changing the existing hook interface. So...
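
For context, a sketch of what a hook might look like against the single-parameter interface; the ExecuteWithHookContext and HookContext names are taken from Hive trunk of that era and should be checked against the actual HIVE-1785 patch:

    import org.apache.hadoop.hive.ql.hooks.ExecuteWithHookContext;
    import org.apache.hadoop.hive.ql.hooks.HookContext;

    // A pre/post execution hook that takes only the HookContext, rather than
    // the 4-5 individual arguments of the old interface.
    public class LoggingHook implements ExecuteWithHookContext {
        @Override
        public void run(HookContext hookContext) throws Exception {
            // Everything the old parameters carried (query plan, inputs, outputs,
            // user info, ...) is reachable through the single context object, so
            // new fields can be added without breaking existing hook implementations.
            System.out.println("Hook invoked for plan: " + hookContext.getQueryPlan());
        }
    }

A hook of this shape would then be wired in through the hive.exec.pre.hooks or hive.exec.post.hooks properties.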

Re: Querying HBase

2010-11-23 Thread John Sichi
It looks a bit like this one, where ISCOMPRESSED was used instead of IS_COMPRESSED: https://issues.apache.org/jira/browse/HIVE-1435 Maybe your datanucleus.identifierFactory is somehow misconfigured? JVS On Nov 23, 2010, at 4:16 PM, Xavier Stevens wrote: > I'm trying to create an external table...
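
A quick way to see what value the metastore is actually picking up for that property; a sketch, assuming hive-site.xml is on the classpath, using only the property key named above:

    import org.apache.hadoop.hive.conf.HiveConf;

    public class CheckIdentifierFactory {
        public static void main(String[] args) {
            // Reads the identifier factory setting from the Hive configuration on
            // the classpath. If the factory generates column names like ISCOMPRESSED
            // instead of IS_COMPRESSED, the generated names will not match the
            // metastore schema, as described in HIVE-1435.
            HiveConf conf = new HiveConf();
            System.out.println("datanucleus.identifierFactory = "
                    + conf.get("datanucleus.identifierFactory", "<not set>"));
        }
    }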

Querying HBase

2010-11-23 Thread Xavier Stevens
I'm trying to create an external table over a pre-existing HBase table using Hive trunk, like so: CREATE EXTERNAL TABLE hbase_metrics (key string, value map) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,counters:") TBLPROPERTIES (...
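
For comparison, a sketch of the full statement issued over JDBC; the map's type parameters (the angle brackets appear to have been stripped by the archive), the TBLPROPERTIES value, and the connection details are guesses patterned on the standard Hive/HBase integration example, not taken from the original message:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateHBaseBackedTable {
        public static void main(String[] args) throws Exception {
            // Connection details are assumptions for a local Hive server.
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();

            // ":key" maps the Hive key column to the HBase row key; "counters:" maps
            // the whole counters column family into the Hive map column.
            stmt.execute(
                "CREATE EXTERNAL TABLE hbase_metrics (key string, value map<string,int>) "
              + "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' "
              + "WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,counters:') "
              + "TBLPROPERTIES ('hbase.table.name' = 'metrics')");

            stmt.close();
            con.close();
        }
    }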

Re: Hive produces very small files despite hive.merge...=true settings

2010-11-23 Thread Ning Zhang
This is to be expected. Compressed text files are not splittable, so CombineHiveInputFormat cannot read multiple files per mapper. CombineHiveInputFormat is used when hive.merge.maponly=true. If you set it to false, we'll use HiveInputFormat, and that should be able to merge compressed tex...
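
Translated into session settings, the suggestion might look like this sketch over JDBC; the connection details and the exact spelling of the map-only merge property are assumptions, so verify the key against hive-default.xml for your build:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class MergeSmallFilesSettings {
        public static void main(String[] args) throws Exception {
            // Connection details are assumptions for a local Hive server.
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();

            // Ask Hive to merge the small output files of map-only and map-reduce jobs.
            stmt.execute("SET hive.merge.mapfiles=true");
            stmt.execute("SET hive.merge.mapredfiles=true");
            // Per the reply above, disabling the map-only merge job makes Hive fall
            // back to HiveInputFormat, which can merge compressed text files that
            // CombineHiveInputFormat cannot split. Property name as written in the
            // message; confirm the exact key for your Hive version.
            stmt.execute("SET hive.merge.maponly=false");

            stmt.close();
            con.close();
        }
    }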