What version of Hadoop are you running? There are many erroneous
instructions for how to get this up and running all over the internet.
You do not need to rebuild hive in order to get it to work. You only
need to do the following:
1. It will only work if HBase is running in distributed or
pseudo-distributed mode.
Are you sure your configuration is correct? There are a couple of other
things that you need to do too:
Make sure:

    <property>
      <name>dfs.support.append</name>
      <value>true</value>
    </property>

is set for both HBase and Hadoop.
Update the Hadoop jars inside the HBase lib directory to use the Hadoop
0.20.205 core jar files.
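Concretely, the jar swap can be sketched like this (the function name and install paths are assumptions, and the jar filename should match whatever your Hadoop distribution actually ships):

```shell
# swap_hadoop_jar is a hypothetical helper name; paths and the exact
# jar version below are assumptions -- adjust to your install.
swap_hadoop_jar() {
  hadoop_home=$1
  hbase_home=$2
  # Drop the Hadoop core jar that HBase shipped with...
  rm -f "$hbase_home"/lib/hadoop-core-*.jar
  # ...and copy in the core jar from the Hadoop you actually run,
  # so both sides speak the same RPC version.
  cp "$hadoop_home"/hadoop-core-0.20.205.0.jar "$hbase_home"/lib/
}

# Example (assumed locations):
# swap_hadoop_jar /usr/local/hadoop-0.20.205.0 /usr/local/hbase-0.90.4
```

Restart HBase after the swap so the new jar is picked up.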
Use 0.20.205.
Original Message
Subject: What version of HDFS is compatible with HBase?
From: Aleksandr Levchuk
Date: Wed, December 21, 2011 3:23 pm
To:
Hi all,
HBase stable http://apache.cs.utah.edu/hbase/stable/ is currently
hbase-0.90.4; what version(s) of HDFS is it compatible with?
Awesome. I started it again and I got the commons-configuration problem.
It's a bit inexplicable, as the jar files have been sitting there for
some time. Whatever. I'm just glad it works. Thanks for your help!
Original Message
Subject: Re: EOFException in HBase 0.94
From: Je
>> What doc you looking at?
Pick one. There's a dozen or so HBase tutorials published on the web,
none of which will leave the reader with a working HBase instance.
Mostly though, I've been trying to get this to work:
http://hbase.apache.org/book/hadoop.html
I've also read the O'Reilly book
Yes. HDFS is up and running, but as soon as HBase connects to it, it
gets the EOF exception and immediately shuts down the master. Most of
the documentation for getting HBase running seems out of date as it
refers to a namenode server running on 8020. There is nothing running on
8020. The namenod
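The 8020-vs-9000 confusion above comes down to fs.default.name in core-site.xml: HBase must use whatever host:port the namenode was actually started with. A minimal sketch (the port value here is an assumption; match your own setup):

```
<!-- core-site.xml: the namenode RPC address HBase must match. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
```

You can confirm the namenode is answering on that port with something like `hadoop fs -ls hdfs://localhost:9000/`, or by checking the namenode's startup log.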
Hi I am trying to get HBase running in pseudo-distributed mode. It
appears to work when I use standalone mode, but in pseudo-distributed
mode I get:
java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapExce
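For comparison, a minimal hbase-site.xml for pseudo-distributed mode might look like the sketch below (localhost:9000 is an assumption; it must match your fs.default.name exactly):

```
<configuration>
  <!-- Point HBase's storage at HDFS rather than the local filesystem. -->
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <!-- Run the daemons as separate processes instead of all-in-one
       standalone mode. -->
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
```

Note that the EOFException itself usually means the Hadoop jar under HBase's lib directory is speaking a different RPC version than the running HDFS, which is what the jar-swap advice earlier in this thread addresses.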