Hi Guys,

Eventually, I managed to solve the issue. As you know, the issue was
related to the map-reduce jobs: the error was clearly saying that a
jar (class) was missing. So I created a folder and copied a few jar
files there, and that worked for me. Please find below the steps I
followed.

I just copied the jar files listed below into the *auxlib* directory. The
parent directory, /usr/lib/hadoop-mapreduce (the library path), is already
defined in the MapReduce configuration XML file.

[root@new ~]# ls /usr/lib/hadoop-mapreduce/auxlib/
guava-12.0.1.jar
hbase-client-0.96.0.2.0.6.0-76-hadoop2.jar
hbase-common-0.96.0.2.0.6.0-76-hadoop2.jar
hbase-protocol-0.96.0.2.0.6.0-76-hadoop2.jar
hbase-server-0.96.0.2.0.6.0-76-hadoop2.jar
hive-hbase-handler-0.12.0.2.0.6.0-76.jar
htrace-core-2.01.jar
zookeeper-3.4.5.2.0.6.0-76.jar
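
For anyone who hits the same NoClassDefFoundError (org.cloudera.htrace.Trace),
here is a rough sketch of the copy step. The source locations
(/usr/lib/hbase/lib, /usr/lib/zookeeper, /usr/lib/hive/lib) and the
mapred-site.xml path are assumptions based on a typical HDP 2.0 layout, so
adjust them to your own installation:

# Directory that is already on the MapReduce task classpath
# (mapreduce.application.classpath in mapred-site.xml).
AUXLIB=/usr/lib/hadoop-mapreduce/auxlib
mkdir -p "$AUXLIB"

# HBase client-side jars plus their htrace, guava and zookeeper dependencies.
cp /usr/lib/hbase/lib/hbase-client-*.jar      "$AUXLIB"/
cp /usr/lib/hbase/lib/hbase-common-*.jar      "$AUXLIB"/
cp /usr/lib/hbase/lib/hbase-server-*.jar      "$AUXLIB"/
cp /usr/lib/hbase/lib/hbase-protocol-*.jar    "$AUXLIB"/
cp /usr/lib/hbase/lib/htrace-core-*.jar       "$AUXLIB"/
cp /usr/lib/hbase/lib/guava-*.jar             "$AUXLIB"/
cp /usr/lib/zookeeper/zookeeper-*.jar         "$AUXLIB"/
cp /usr/lib/hive/lib/hive-hbase-handler-*.jar "$AUXLIB"/

# Sanity check: confirm the auxlib parent directory really is on the task
# classpath, otherwise the copied jars will still not be picked up.
grep -A 1 mapreduce.application.classpath /etc/hadoop/conf/mapred-site.xml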



On Thu, Dec 26, 2013 at 10:38 PM, Vikas Parashar <para.vi...@gmail.com> wrote:

> Hi buddy,
>
> I have gone through the given link, but no luck.
>
> Actually, I am working on HDP. There might be some issue with YARN and MR.
>
>
> On Thu, Dec 26, 2013 at 10:10 PM, kulkarni.swar...@gmail.com <
> kulkarni.swar...@gmail.com> wrote:
>
>> Seems like you are running Hive on YARN instead of MR1. I have had some
>> issues in the past doing so. The post here[1] has some solutions on how to
>> configure Hive to work with YARN. Hope that helps.
>>
>> [1]
>> https://groups.google.com/a/cloudera.org/forum/#!topic/cdh-user/gHVq9C5H6RE
>>
>>
>> On Thu, Dec 26, 2013 at 10:35 AM, Vikas Parashar <para.vi...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am integrating Hive (0.12) with HBase (0.96). Everything is working fine
>>> there, but I am stuck on one of two queries.
>>>
>>> Creating a table or running select * from the table works fine, but
>>> select count(*) from the table gives me the error below.
>>>
>>>
>>> 2013-12-26 13:25:01,864 ERROR ql.Driver
>>> (SessionState.java:printError(419)) - FAILED: Execution Error, return code
>>> 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
>>> 2013-12-26 13:25:01,869 WARN  mapreduce.Counters
>>> (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is
>>> deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
>>> 2013-12-26 14:25:44,119 WARN  mapreduce.JobSubmitter
>>> (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option
>>> parsing not performed. Implement the Tool interface and execute your
>>> application with ToolRunner to remedy this.
>>> 2013-12-26 14:26:14,677 WARN  mapreduce.Counters
>>> (AbstractCounters.java:getGroup(234)) - Group
>>> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
>>> org.apache.hadoop.mapreduce.TaskCounter instead
>>> 2013-12-26 14:26:33,613 WARN  mapreduce.Counters
>>> (AbstractCounters.java:getGroup(234)) - Group
>>> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
>>> org.apache.hadoop.mapreduce.TaskCounter instead
>>> 2013-12-26 14:27:30,355 WARN  mapreduce.Counters
>>> (AbstractCounters.java:getGroup(234)) - Group
>>> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
>>> org.apache.hadoop.mapreduce.TaskCounter instead
>>> 2013-12-26 14:27:32,479 WARN  mapreduce.Counters
>>> (AbstractCounters.java:getGroup(234)) - Group
>>> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
>>> org.apache.hadoop.mapreduce.TaskCounter instead
>>> 2013-12-26 14:27:32,528 ERROR exec.Task
>>> (SessionState.java:printError(419)) - Ended Job = job_1388037394132_0013
>>> with errors
>>> 2013-12-26 14:27:32,530 ERROR exec.Task
>>> (SessionState.java:printError(419)) - Error during job, obtaining debugging
>>> information...
>>> 2013-12-26 14:27:32,538 ERROR exec.Task
>>> (SessionState.java:printError(419)) - Examining task ID:
>>> task_1388037394132_0013_m_000000 (and more) from job job_1388037394132_0013
>>> 2013-12-26 14:27:32,539 WARN  shims.HadoopShimsSecure
>>> (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
>>> TaskLogServlet is not supported in MR2 mode.
>>> 2013-12-26 14:27:32,593 WARN  shims.HadoopShimsSecure
>>> (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
>>> TaskLogServlet is not supported in MR2 mode.
>>> 2013-12-26 14:27:32,596 WARN  shims.HadoopShimsSecure
>>> (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
>>> TaskLogServlet is not supported in MR2 mode.
>>> 2013-12-26 14:27:32,599 WARN  shims.HadoopShimsSecure
>>> (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
>>> TaskLogServlet is not supported in MR2 mode.
>>> 2013-12-26 14:27:32,615 ERROR exec.Task
>>> (SessionState.java:printError(419)) -
>>> Task with the most failures(4):
>>> -----
>>> Task ID:
>>>   task_1388037394132_0013_m_000000
>>>
>>> URL:
>>>
>>> http://ambari1.hadoop.com:8088/taskdetails.jsp?jobid=job_1388037394132_0013&tipid=task_1388037394132_0013_m_000000
>>> -----
>>> Diagnostic Messages for this Task:
>>> Error: java.io.IOException: java.io.IOException:
>>> java.lang.reflect.InvocationTargetException
>>> at
>>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
>>>  at
>>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
>>> at
>>> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:244)
>>>  at
>>> org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:538)
>>> at
>>> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
>>>  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>>>  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>>  at javax.security.auth.Subject.doAs(Subject.java:396)
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
>>> Caused by: java.io.IOException:
>>> java.lang.reflect.InvocationTargetException
>>> at
>>> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
>>>  at
>>> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
>>> at
>>> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
>>>  at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
>>> at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
>>>  at
>>> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:91)
>>> at
>>> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
>>>  ... 9 more
>>> Caused by: java.lang.reflect.InvocationTargetException
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>  at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>> at
>>> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
>>>  ... 15 more
>>> Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
>>> at
>>> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
>>>  at
>>> org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
>>> at
>>> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
>>>  at
>>> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
>>> at
>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
>>>  at
>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
>>> ... 20 more
>>> Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>>  at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>  ... 26 more
>>>
>>>
>>> 2013-12-26 14:27:32,870 ERROR ql.Driver
>>> (SessionState.java:printError(419)) - FAILED: Execution Error, return code
>>> 2 from org.apache.
>>>
>>> I think this error is related to the map-reduce job. Whenever my query
>>> uses map-reduce, I get this error.
>>>
>>> Any idea!!
>>>
>>
>>
>>
>> --
>> Swarnim
>>
>
>
