Hi,
When running a query like:
EXPLAIN SELECT * FROM mytable
over the JDBC driver, calling ResultSet.next() returns false at the end of the
abstract syntax tree, even though there are more rows to fetch.
So typical JDBC processing code of the form:
while (rs.next()) {
  // process a row
}
stops before all the rows have been fetched.
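To make the failure mode concrete, here is a minimal, self-contained sketch of that processing loop. Since no live HiveServer is available here, a reflective stub stands in for the driver's ResultSet and simply stops serving rows early, the way the report describes; the stub and its row contents are illustrative, not the actual driver behavior.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.sql.ResultSet;

public class ExplainLoop {

    // Stub ResultSet that serves `rows` rows and then reports exhaustion,
    // standing in for a real Hive JDBC connection (assumption for the demo).
    static ResultSet stub(final int rows) {
        return (ResultSet) Proxy.newProxyInstance(
                ResultSet.class.getClassLoader(),
                new Class<?>[] { ResultSet.class },
                new InvocationHandler() {
                    int served = 0;
                    @Override
                    public Object invoke(Object proxy, Method m, Object[] args) {
                        if (m.getName().equals("next")) {
                            return served++ < rows;  // false once "exhausted"
                        }
                        if (m.getName().equals("getString")) {
                            return "plan line " + served;
                        }
                        return null;
                    }
                });
    }

    public static void main(String[] args) throws Exception {
        ResultSet rs = stub(3);  // pretend the driver cuts off after 3 lines
        int fetched = 0;
        while (rs.next()) {      // the standard JDBC idiom from the report
            rs.getString(1);     // process a row
            fetched++;
        }
        // With the bug described above, this undercounts the real plan output.
        System.out.println("fetched=" + fetched);
    }
}
```

The point is that the loop itself is idiomatic JDBC; nothing in it can detect that next() returned false prematurely, which is why this needs fixing on the driver side.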
Hi All
Before HIVE-1785 was committed, the old hook interface took 4 or 5
parameters in its run method, which does not scale as hooks get more
complicated. Hooks may need more information than those 4 or 5 parameters
carry. Hive should support adding more hooks without changing the existing
hook interface.
So
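A common way out of this is the context-object pattern: pass the hook a single context that can grow new fields without ever changing the hook's signature. A minimal sketch of that idea follows; the class and field names are illustrative stand-ins, not the actual Hive API introduced by HIVE-1785.

```java
import java.util.ArrayList;
import java.util.List;

public class HookDemo {

    // Illustrative context object (assumption: field set chosen for the demo).
    static class HookContext {
        String queryId;
        String queryString;
        List<String> inputs = new ArrayList<>();
        List<String> outputs = new ArrayList<>();
        // New fields can be added here without touching any hook signature.
    }

    // A hook receives the whole context instead of 4 or 5 positional params.
    interface ExecuteWithHookContext {
        void run(HookContext context) throws Exception;
    }

    public static void main(String[] args) throws Exception {
        HookContext ctx = new HookContext();
        ctx.queryId = "q1";
        ctx.queryString = "SELECT * FROM mytable";
        ctx.inputs.add("default@mytable");

        // Hooks only depend on the fields they actually read.
        ExecuteWithHookContext auditHook =
                c -> System.out.println("audit: " + c.queryId + " " + c.queryString);
        auditHook.run(ctx);
    }
}
```

With this shape, adding information for a new, more complicated hook means adding a field to the context class; existing hooks keep compiling and running unchanged.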
It looks a bit like this one where ISCOMPRESSED was used instead of
IS_COMPRESSED:
https://issues.apache.org/jira/browse/HIVE-1435
Maybe your datanucleus.identifierFactory is somehow misconfigured?
JVS
On Nov 23, 2010, at 4:16 PM, Xavier Stevens wrote:
> I'm trying to create an external table
I'm trying to create an external table to a pre-existing HBase table
using Hive trunk, like so:
CREATE EXTERNAL TABLE hbase_metrics (key string, value map)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH
SERDEPROPERTIES ("hbase.columns.mapping" = ":key,counters:")
TBLPROPERTIES (
This is expected: compressed text files are not splittable, so
CombineHiveInputFormat cannot read multiple files per mapper.
CombineHiveInputFormat is used when hive.merge.maponly=true. If you set it to
false, we'll use HiveInputFormat, and that should be able to merge compressed
text files.
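Concretely, assuming the property name as given above, the switch suggested here is just:

```sql
-- Disable the map-only merge so Hive falls back to HiveInputFormat,
-- which (per the explanation above) can merge the compressed text files.
SET hive.merge.maponly=false;
```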