OK, so we found that we were missing some JARs in our /usr/lib/hadoop/lib
folder. I posted the answer here:
http://stackoverflow.com/questions/27374810/my-cdh5-2-cluster-get-filenotfoundexception-when-running-hbase-mr-jobs/27501623#27501623
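A quick way to confirm that diagnosis on a node is to check whether the HBase client JARs are visible under Hadoop's lib directory. This is a sketch assuming CDH's default layout; the directory path and the exact jar names are assumptions, not a definitive CDH 5.2 manifest:

```shell
# Check whether the HBase jars an MR job needs are present in Hadoop's
# lib dir (paths and jar names are assumptions for CDH defaults).
HADOOP_LIB=${HADOOP_LIB:-/usr/lib/hadoop/lib}
missing=""
for jar in hbase-client hbase-server hbase-protocol htrace-core; do
  ls "$HADOOP_LIB"/$jar*.jar >/dev/null 2>&1 || missing="$missing $jar"
done
if [ -z "$missing" ]; then
  echo "all expected jars present"
else
  echo "missing:$missing"
fi
# If jars are missing, one common fix is symlinking them from HBase's
# own lib dir, e.g.:
#   sudo ln -s /usr/lib/hbase/lib/hbase-client*.jar "$HADOOP_LIB"/
```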
On Thu, Dec 11, 2014 at 2:14 PM, Ehud Lev e...@gigya
Are you using YARN? If yes, can you try yarn jar
/path/to/hbase-server.jar rowcounter 't1' and see if that works?
I guess this is an HBase environment issue, but I can't put my finger on
it.
Hi Bharath,
I get the same error from both of them ('hadoop jar' and 'yarn jar').
Put the dependency from your local
file system into HDFS (the error is because that JAR is not there).
-Dima
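For reference, copying a dependency into HDFS looks roughly like the following. The jar path and the HDFS target directory are assumptions; substitute whatever your job actually references:

```shell
# Stage the jar in HDFS so the MR job can find it (paths are examples).
hdfs dfs -mkdir -p /user/$USER/lib
hdfs dfs -put /usr/lib/hbase/lib/hbase-server.jar /user/$USER/lib/
# Verify it landed:
hdfs dfs -ls /user/$USER/lib
```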
On Tue, Dec 9, 2014 at 1:05 AM, Ehud Lev e...@gigya-inc.com wrote:
My cdh5.2 cluster has a problem to run hbase MR jobs.
For example, I added the hbase classpath into the hadoop classpath:
vi /etc/hadoop/conf/hadoop-env.sh
add the line:
export HADOOP_CLASSPATH=$(/usr/lib/hbase/bin/hbase classpath):$HADOOP_CLASSPATH
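One subtlety worth spelling out: the 'hbase classpath' part must run as command substitution, so that the command's output (a colon-separated jar list) is what lands in HADOOP_CLASSPATH; assigning the literal path of the hbase script does nothing useful. A minimal sketch, where the hbase_classpath function is a stand-in for the real /usr/lib/hbase/bin/hbase classpath command and the printed paths are made up:

```shell
# Stub for `/usr/lib/hbase/bin/hbase classpath` (assumption: the real
# command prints a colon-separated list of HBase jars).
hbase_classpath() {
  echo "/usr/lib/hbase/lib/hbase-client.jar:/usr/lib/hbase/lib/hbase-server.jar"
}

HADOOP_CLASSPATH=/usr/lib/hadoop/lib/some.jar

# Wrong: assigns a literal string; no jars are ever expanded.
wrong="/usr/lib/hbase/bin/hbase classpath:$HADOOP_CLASSPATH"
echo "wrong: $wrong"

# Right: $(...) runs the command and splices its output into the value.
HADOOP_CLASSPATH="$(hbase_classpath):$HADOOP_CLASSPATH"
echo "right: $HADOOP_CLASSPATH"
```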
And when I run:
hadoop jar