Hi all,

I am using CDH 5.2.1 without Kerberos, with hbase-0.98.6-cdh5.2.1 and hive-0.13.1-cdh5.2.1.

I use the Hive CLI to create a Hive external table mapped to an HBase table:

create external table table1(rowkey String, cfcol1 String)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:col1")
TBLPROPERTIES ("hbase.table.name" = "hbasetable1");

When I run a join query, select t1.cfcol1 from table1 t1 join table1 t2 on
t1.cfcol1 = t2.cfcol1, it works from the Hive CLI. But when I run the same
query through org.apache.hadoop.hive.ql.Driver, it throws this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:412)
at java.lang.ClassLoader.loadClass(ClassLoader.java:412)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:238)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:226)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:745)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:113)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:918)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:826)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:840)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.main(ExecDriver.java:733)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 43 more


Here is my Java code:

// Relevant imports for the snippet below:
import org.apache.hadoop.hive.cli.CliSessionState;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Driver;
import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse;
import org.apache.hadoop.hive.ql.session.SessionState;

        // Build a CLI session state from a fresh HiveConf.
        CliSessionState ss = new CliSessionState(new HiveConf(SessionState.class));
        HiveConf hiveConf = ss.getConf();

        // HBase, HDFS and YARN endpoints.
        hiveConf.set("hbase.zookeeper.quorum", "host1");
        hiveConf.set("hbase.zookeeper.property.clientPort", "2181");
        hiveConf.set("fs.default.name", "hdfs://host1:8020");
        hiveConf.set("yarn.resourcemanager.address", "host1:8032");
        hiveConf.set("yarn.resourcemanager.scheduler.address", "host1:8030");
        hiveConf.set("yarn.resourcemanager.resource-tracker.address", "host1:8031");
        hiveConf.set("yarn.resourcemanager.admin.address", "host1:8033");
        hiveConf.set("mapreduce.framework.name", "yarn");
        hiveConf.set("mapreduce.jobhistory.address", "host1:10020");
        hiveConf.set("yarn.nodemanager.aux-services", "mapreduce_shuffle");
        hiveConf.set("yarn.application.classpath",
                "$HADOOP_CLIENT_CONF_DIR,$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*");
        // Auxiliary jar to ship with the job (currently only hbase-server.jar).
        hiveConf.set("hive.aux.jars.path",
                "file:///opt/cloudera/parcels/CDH/lib/hive/lib/hbase-server.jar");
        // Metastore connection.
        hiveConf.set("javax.jdo.option.ConnectionURL", "jdbc:postgresql://192.168.1.211:7432/hive2");
        hiveConf.set("javax.jdo.option.ConnectionDriverName", "org.postgresql.Driver");
        hiveConf.set("javax.jdo.option.ConnectionUserName", "hive2");
        hiveConf.set("javax.jdo.option.ConnectionPassword", "3t4KTtyCXZ");
        hiveConf.set("hive.dbname", "hive2");
        hiveConf.set("hbase.client.retries.number", "1");

        // Start the session and run the join through the Driver API.
        SessionState.start(ss);
        Driver dd = new Driver(hiveConf);
        CommandProcessorResponse res = dd.run(
                "select t1.cfcol1 from table1 t1 join table1 t2 on t1.cfcol1 = t2.cfcol1");


Can anyone help?
