I am using hbase-0.94.8.
On Wed, Apr 29, 2015 at 11:56 PM, Ted Yu yuzhih...@gmail.com wrote:
Can you enable HBase DEBUG logging in log4j.properties so that we can have
more clues?
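For reference, a minimal log4j.properties tweak for this (a sketch, assuming the stock HBase conf layout; put a copy on the Spark client classpath too if it ships its own log4j.properties) might look like:

```properties
# In $HBASE_HOME/conf/log4j.properties
log4j.logger.org.apache.hadoop.hbase=DEBUG
# ZooKeeper client chatter is often useful for connection problems as well
log4j.logger.org.apache.zookeeper=DEBUG
```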
What HBase release are you using?
Cheers
On Wed, Apr 29, 2015 at 4:27 AM, Saurabh Gupta wrote:
I was able to solve the issue by setting
SparkConf sconf = new SparkConf().setAppName("App").setMaster("local");
and
conf.set("zookeeper.znode.parent", "/hbase-unsecure");
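Equivalently (a sketch — /hbase-unsecure is the default znode parent on HDP-style installs, while plain Apache HBase defaults to /hbase), the same setting can live in an hbase-site.xml on the client classpath instead of being set in code:

```xml
<!-- hbase-site.xml visible to the Spark driver/executors -->
<configuration>
  <property>
    <name>zookeeper.znode.parent</name>
    <value>/hbase-unsecure</value>
  </property>
</configuration>
```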
Standalone hbase has a table 'test'
hbase(main):001:0> scan 'test'
ROW COLUMN+CELL
row1
The error indicates incompatible protobuf versions.
Please take a look at 4.1.1 under
http://hbase.apache.org/book.html#basic.prerequisites
Cheers
On Thu, Apr 30, 2015 at 3:49 AM, Saurabh Gupta saurabh.gu...@semusi.com
wrote:
Now able to solve the issue by setting
SparkConf sconf = new
Looking at your classpath, it looks like you've compiled Spark yourself.
Depending on which version of Hadoop you've compiled against (it looks like
Hadoop 2.2 in your case), Spark will bundle its own version of protobuf.
You should make sure both your HBase and Spark are compiled against the
same protobuf version.
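One quick way to spot a conflict is to look for protobuf-java-*.jar under each install. The layout below is purely illustrative (the paths and version numbers are made up for the demo, not taken from this thread); in practice you would run the find against your real Spark and HBase directories:

```shell
# Illustrative only: fake a Spark and an HBase install with mismatched
# protobuf jars, then locate them the way you would against a real install.
mkdir -p /tmp/pbcheck/spark/jars /tmp/pbcheck/hbase/lib
touch /tmp/pbcheck/spark/jars/protobuf-java-2.5.0.jar
touch /tmp/pbcheck/hbase/lib/protobuf-java-2.4.1.jar
find /tmp/pbcheck -name 'protobuf-java-*.jar' | sort
# Two different versions in the output means the classpath has a conflict.
```

If the versions differ, rebuild one side (or shade protobuf) so both stacks agree.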
Hi,
I am working with standalone HBase, and I want to execute HBaseTest.scala
(from the Scala examples).
I have created a test table with three rows, and I just want to get the row
count using HBaseTest.scala.
I am getting this issue:
15/04/29 11:17:10 INFO BlockManagerMaster: Registered BlockManager