prasannaP created SPARK-17373:
---------------------------------

             Summary: spark+hive+hbase+hbaseIntegration not working
                 Key: SPARK-17373
                 URL: https://issues.apache.org/jira/browse/SPARK-17373
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
            Reporter: prasannaP
SparkSQL + Hive + HBase (HBaseIntegration) doesn't work

Hi,

I am getting an error when I try to query a Hive table (created through HBaseIntegration) from Spark.

Steps I followed:

*Hive table creation code*:

CREATE TABLE test.sample(id string, name string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,details:name")
TBLPROPERTIES ("hbase.table.name" = "sample");

*DESCRIBE test.sample;*

col_name    data_type    comment
id          string       from deserializer
name        string       from deserializer

*Starting spark-shell*:

spark-shell --master local[2] --driver-class-path /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar:/usr/local/hbase/lib/htrace-core-2.04.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2-tests.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2-tests.jar:/usr/local/hive/lib/zookeeper-3.4.6.jar:/usr/local/hive/lib/guava-14.0.1.jar

In spark-shell:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(*) from test.sample").collect()

I also added this setting in hadoop-env.sh:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/lib/*

*Stack trace*:

SQL context available as sqlContext.
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes

Could somebody help me resolve this error? I would really appreciate the help. Thank you.
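PS: For completeness, here is the invocation I would try next, based on my (possibly wrong) understanding that --driver-class-path puts the jars only on the driver's classpath, while --jars (comma-separated) also distributes them to executors. The jar paths are the same ones as above and this is an untested sketch, not a confirmed fix:

spark-shell --master local[2] \
  --jars /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar,\
/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/hbase-hadoop-compat-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar,\
/usr/local/hbase/lib/htrace-core-2.04.jar,\
/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2-tests.jar,\
/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2-tests.jar,\
/usr/local/hive/lib/zookeeper-3.4.6.jar,\
/usr/local/hive/lib/guava-14.0.1.jar

and then the same query in the shell:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(*) from test.sample").collect()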