OK, so we found that we were lacking some jars in our /usr/lib/hadoop/lib
folder. I posted the answer here:
http://stackoverflow.com/questions/27374810/my-cdh5-2-cluster-get-filenotfoundexception-when-running-hbase-mr-jobs/27501623#27501623
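For anyone skimming the thread: the fix was to make the missing HBase jars visible under /usr/lib/hadoop/lib. A minimal sketch of that (the glob and paths are assumptions based on a stock CDH layout, not copied from the answer; check /usr/lib/hbase/lib on your own cluster for the actual jar set):

```shell
# Symlink the HBase jars into Hadoop's lib directory so MR tasks can
# find them on their classpath. Which jars are actually needed varies
# by CDH version; this glob is illustrative only.
for jar in /usr/lib/hbase/lib/hbase-*.jar; do
  sudo ln -sf "$jar" /usr/lib/hadoop/lib/
done
```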
On Thu, Dec 11, 2014 at 2:14 PM, Ehud Lev wrote:
> Are you using yarn? If yes, can you try "yarn jar
> /path/to/hbase-server.jar rowcounter 't1'" and see if that works?
>> I guess this is an hbase environment issue, but I can't put my finger on
>> it.
>> Hi, Bharath
I get the same error from both of them ('hadoop jar' and 'yarn jar').
> From: Dima Spivak
> Date: Tue, Dec 9, 2014 at 11:23 PM
> Subject: Re: My cdh5.2 cluster get FileNotFoundException when running
> hbase MR jobs
> To: "user@hbase.apache.org"
> Cc: Yaniv Yancovich
>
> Dear Ehud,
>
> You need the -libjars argument to move the dependency on your local
> file system into HDFS (the error is because that JAR is not there).
>
> -Dima
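Combining Dima's -libjars suggestion with the earlier rowcounter command, the invocation might look like the sketch below (a sketch only; the jar paths and the table name 't1' are placeholders, and the exact hbase-server jar name depends on your CDH version):

```shell
# Put the HBase jars on the client classpath, then ship the dependency
# jars to the cluster with -libjars so the MR tasks can load them.
# Paths and versions are examples, not taken from the thread.
export HADOOP_CLASSPATH="$(/usr/lib/hbase/bin/hbase classpath)"
hadoop jar /usr/lib/hbase/hbase-server.jar rowcounter \
  -libjars /usr/lib/hbase/lib/hbase-client.jar,/usr/lib/hbase/lib/hbase-protocol.jar \
  't1'
```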
On Tue, Dec 9, 2014 at 1:05 AM, Ehud Lev wrote:
My cdh5.2 cluster has a problem running hbase MR jobs.
For example, I added the hbase classpath into the hadoop classpath:
vi /etc/hadoop/conf/hadoop-env.sh
and added the line:
export HADOOP_CLASSPATH="/usr/lib/hbase/bin/hbase classpath:$HADOOP_CLASSPATH"
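Note that as written, that line adds the literal script path to the classpath rather than the jar list that `hbase classpath` prints. If that was not intentional (it may simply be a copy-paste artifact of the email), a command substitution is probably what was meant; a sketch, assuming the stock CDH path:

```shell
# In /etc/hadoop/conf/hadoop-env.sh: expand the *output* of
# `hbase classpath` (a colon-separated jar list) into the variable,
# rather than inserting the literal script path.
export HADOOP_CLASSPATH="$(/usr/lib/hbase/bin/hbase classpath):$HADOOP_CLASSPATH"
```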
And when I run:
hadoop jar /usr/lib/hbase