Hi Stuti,
I ran into the same problem. The solution I found was to add
zookeeper-3.3.2.jar and log4j-1.2.5.jar to HBase's classpath, and it worked
fine.
Thank you, Stuti.

While executing another MapReduce program I am getting the following error
on the Eclipse console, even though I added commons-cli-1.2.jar to the
classpath in HBase's hbase-env.sh.

11/12/16 10:45:15 WARN mapred.LocalJobRunner: job_local_0001
java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:242)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 1 more

What could be the problem? Please help.
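One observation, as a sketch of a possible fix: since the error comes from LocalJobRunner, the missing class must be on the classpath of the JVM that launches the job (the Eclipse build path, or HADOOP_CLASSPATH when launching with the hadoop script), not only in hbase-env.sh, which affects the HBase daemons. The jar path below is an assumption; point it at wherever commons-httpclient actually lives in your install:

```shell
# Make commons-httpclient visible to the client JVM that runs LocalJobRunner.
# /usr/lib/hbase/lib/commons-httpclient-3.1.jar is a placeholder path --
# adjust it to your HBase lib directory and jar version.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/usr/lib/hbase/lib/commons-httpclient-3.1.jar"

# Then launch through the hadoop script so the variable is picked up, e.g.:
# bin/hadoop jar project.jar program_name arg1 arg2
```

In Eclipse the equivalent is adding the same jar to the project's build path, since Eclipse ignores HADOOP_CLASSPATH.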



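For reference, a minimal sketch of the Tool-based driver Harsh suggests below, which is what makes a `-jt <host:port>` switch work. The class name `MyHsetDriver` and the omitted mapper/path setup are placeholders, not the actual code from this thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Placeholder driver: implementing Tool lets ToolRunner parse generic
// options such as -jt <host:port> and -fs <namenode:port> before run()
// is called, so the same jar runs locally or against the cluster.
public class MyHsetDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();        // already holds -jt / -fs values
        Job job = new Job(conf, "SET-Insertion");
        job.setJarByClass(MyHsetDriver.class);
        // job.setMapperClass(...), input/output paths, etc. go here
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new MyHsetDriver(), args));
    }
}
```

Launched as `bin/hadoop jar project.jar MyHsetDriver -jt <host:port> in out`, the generic options are stripped off before `args` reaches run().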
On Fri, Dec 16, 2011 at 10:16 AM, Stuti Awasthi <stutiawas...@hcl.com> wrote:

> Hi Vamshi,
> Are you sure you have added the relevant jars to your classpath correctly?
> The error is: Caused by: java.lang.ClassNotFoundException:
> org.apache.zookeeper.KeeperException
> Your code is not able to find the proper jars on the path.
>
> Hope this helps!
>
> -----Original Message-----
> From: Vamshi Krishna [mailto:vamshi2...@gmail.com]
> Sent: Wednesday, December 14, 2011 8:14 PM
> To: user@hbase.apache.org
> Subject: Re: No changes or progress status on web UI during mapreduce
> program running
>
> Hi, thank you. All these days I have been coding in Eclipse and running the
> program from Eclipse only, but I never saw the program run on the cluster;
> it only ran on the LocalJobRunner, even though I set
> config.set("mapred.job.tracker", "jthost:port");
>
> Now I realize one thing; please correct me if I am wrong: "write the code
> in Eclipse, then build it, then jar it, then run it through the command
> line from the Hadoop home folder".
>
> For example: {Hadoop_home}/bin/hadoop jar project.jar program_name -jt
> <host:port> argument_1 argument_2 ..
> Is that the correct way? Please correct me if I am wrong.
>
> Now, I did the same thing as mentioned in the lines above. I started
> running the program from one of the datanode machines (one of the machines
> in my 2-node cluster), and now I observe that the program is running on the
> cluster. I specified 4 map tasks and 2 reduce tasks, but out of the 4 map
> tasks, only the 2 tasks on the datanode machine are running; the other 2
> map tasks submitted to the namenode machine are not. I got the following
> error on the console and on the jobtracker web UI page for the
> corresponding tasks.
>
> What is the problem? Please help.
>
>
>
> java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
>    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:569)
>    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
>    at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.lang.reflect.InvocationTargetException
>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
>    ... 3 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException
>    at SetImplementation.MyHset$setInsertionMapper.<init>(MyHset.java:138)
>    ... 8 more
> Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException
>    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>    at java.security.AccessController.doPrivileged(Native Method)
>    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>    ... 9 more
>
>
>
>
>
> On Tue, Dec 13, 2011 at 9:22 AM, Harsh J <ha...@cloudera.com> wrote:
>
> > Vamshi,
> >
> > One easy hack is to:
> > config.set("mapred.job.tracker", "jthost:port");
> >
> > > (Or better yet, always use the Tool interface to write your Hadoop
> > > jobs; then you can simply pass a "-jt <host:port>" on the
> > > command-line when you want it to run against a cluster.)
> >
> > On 13-Dec-2011, at 8:43 AM, Vamshi Krishna wrote:
> >
> > > What should I set in the job's classpath? Where and how should I set
> > > the classpath for the job? My requirement is to run the MR jobs on the
> > > cluster of nodes and NOT via LocalJobRunner when I start the program
> > > from Eclipse. Please help me.
> > > My snippet of code for the job settings is here; are there any more
> > > settings I need to add?
> > >
> > > public static void main(String args[]) throws IOException,
> > > InterruptedException, ClassNotFoundException
> > >    {
> > >
> > >        Configuration config=HBaseConfiguration.create();
> > >        Job job=new Job(config, "SET-Insetion");
> > >        job.setJarByClass(MyHset.class);
> > >        job.setMapperClass(setInsertionMapper.class);
> > >
> > >        ...
> > >        ...
> > >
> > > On Mon, Dec 12, 2011 at 11:35 PM, Jean-Daniel Cryans <
> > jdcry...@apache.org>wrote:
> > >
> > >> That setting also needs to be in your job's classpath; it won't guess it.
> > >>
> > >> J-D
> > >>
> > >> On Thu, Dec 8, 2011 at 10:14 PM, Vamshi Krishna
> > >> <vamshi2...@gmail.com>
> > >> wrote:
> > >>> Hi Harsh,
> > >>> Yes, no jobs are seen on that jobtracker page: under RUNNING JOBS it
> > >>> is none, under FINISHED JOBS it is none, under FAILED JOBS it is
> > >>> none. It is just as if no job is running. While the mapreduce program
> > >>> was running I could see in Eclipse that, as you said, Eclipse is
> > >>> merely launching the program via a LocalJobRunner.
> > >>> I ran it like this:
> > >>>
> > >>> 1) right click on my main java file -> run as -> java application; so
> > >>> it happened as I mentioned.
> > >>>
> > >>> So I even tried doing this:
> > >>>
> > >>> 2) right click on my main java file -> run as -> Run on hadoop; now
> > >>> nothing happens. I mean no job is created and no process seems to
> > >>> start. I checked the jobtracker and tasktracker pages as well, and
> > >>> there too I could see that no jobs are running; all are none.
> > >>>
> > >>> But if I look at my mapred-site.xml file in the conf directory of
> > >>> hadoop, it is like this:
> > >>>
> > >>> <name>mapred.job.tracker</name>
> > >>> <value>hadoop-namenode:9001</value>
> > >>>
> > >>> This hadoop-namenode's IP address is 10.0.1.54, and I am running my
> > >>> mapreduce job from an Eclipse residing on the same machine. So
> > >>> mapred.job.tracker is set to a machine and port, and the job should
> > >>> then be submitted as a distributed job, right? But why is this not
> > >>> happening? On all machines, all daemons are running.
> > >>> What should I do to run it on the cluster from Eclipse? Please help.
> > >>> On Thu, Dec 8, 2011 at 12:12 PM, Harsh J <ha...@cloudera.com> wrote:
> > >>>
> > >>>> Do you not see progress, or do you not see a job at all?
> > >>>>
> > >>>> Perhaps the problem is that your Eclipse is merely launching the
> > >>>> program via a LocalJobRunner, and not submitting to the cluster.
> > >>>> This is because of an improper config setup (you need
> > >>>> "mapred.job.tracker" set at minimum, to submit a distributed job).
> > >>>>
> > >>>> On 08-Dec-2011, at 12:10 PM, Vamshi Krishna wrote:
> > >>>>
> > >>>>> Hi all,
> > >>>>> I am running hbase on a 3-machine cluster. I am running a mapreduce
> > >>>>> program to insert data into an hbase table from Eclipse, so while
> > >>>>> it was running I opened the hadoop jobtracker and tasktracker pages
> > >>>>> (http://10.0.1.54:50030 and http://10.0.1.54:50060) in a browser,
> > >>>>> but I could see no changes or progress of the mapreduce jobs, such
> > >>>>> as the map tasks' progress. What is the problem, and how can I see
> > >>>>> their progress in the browser while the mapreduce program is
> > >>>>> running from Eclipse? I am using ubuntu-10.04.
> > >>>>>
> > >>>>> Can anybody help?
> > >>>>>
> > >>>>> --
> > >>>>> *Regards*
> > >>>>> *Vamshi Krishna*
> > >>>>
> > >>>>
> > >>>
> > >>>
> > >>
> > >
> > >
> > >
> >
> >
>
>
>



-- 
*Regards*
*Vamshi Krishna*
