Hi,
The previous problem got solved, but now I have a new one. The output just
hangs at "map 0% reduce 0%", and then after an eternity there is a
NullPointerException. I checked the Hadoop and HBase logs: there is no log
file created for HBase, and the log file in Hadoop is just an XML file. I do
not understand what the problem is. Why do none of these problems appear when
I run your example?
03/05/01 07:57:43 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
03/05/01 07:57:43 INFO mapred.HTableInputFormatBase: split: 0->zeus:,
03/05/01 07:57:44 INFO mapred.JobClient: Running job: job_200305010723_0008
03/05/01 07:57:45 INFO mapred.JobClient:  map 0% reduce 0%
03/05/01 08:06:59 INFO mapred.JobClient: Task Id : attempt_200305010723_0009_m_000000_0, Status : FAILED
java.lang.NullPointerException
        at org.apache.hama.mapred.HTableRecordReaderBase.restart(HTableRecordReaderBase.java:62)
        at org.apache.hama.mapred.HTableRecordReaderBase.init(HTableRecordReaderBase.java:73)
        at org.apache.hama.mapred.VectorInputFormat$TableRecordReader.init(VectorInputFormat.java:59)
        at org.apache.hama.mapred.VectorInputFormat.getRecordReader(VectorInputFormat.java:155)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
        at org.apache.hadoop.mapred.Child.main(Child.java:158)
Thank You
Abhishek Agrawal
SUNY- Buffalo
(716-435-7122)
On Sat 11/07/09 6:03 AM , "Edward J. Yoon" [email protected] sent:
> Hi,
>
> One very simple way is to copy the jars (hbase, hama) into the
> {$HADOOP_HOME}/lib folder on all machines and restart the hadoop cluster.
>
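For concreteness, the jar-copy fix described above might look like the following sketch. The hostnames (zeus, aphrodite) and jar versions are taken from elsewhere in this thread and are assumptions about the actual cluster layout:

```shell
# Sketch only: push the hbase/hama jars into hadoop's lib dir on every node,
# then restart the cluster so the task JVMs get them on their classpath.
for host in zeus aphrodite; do
  scp "$HBASE_HOME/hbase-0.19.3.jar" \
      "$HAMA_HOME/hama-0.1.0-dev.jar" \
      "$HAMA_HOME"/lib/*.jar \
      "$host:$HADOOP_HOME/lib/"
done
"$HADOOP_HOME/bin/stop-all.sh" && "$HADOOP_HOME/bin/start-all.sh"
```

This is the blunt-instrument approach; jars can also be shipped per job, but with Hadoop 0.19-era tooling the lib-folder copy is the simplest thing that works.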
> On Sat, Nov 7, 2009 at 11:49 AM, <aa...@buffalo.edu> wrote:
> > Hey,
> > The above mentioned error got solved by adding hbase-0.19.3.jar to the
> > export classpath in $HADOOP_HOME/conf/hadoop-env.sh. But now I am getting
> > a new bug. This time the program cannot find files of hama-0.1.0-dev.jar,
> > in spite of the fact that I have added it to hadoop-env.sh. Do I need to
> > do something else as well? Any other ideas?
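For reference, the hadoop-env.sh change being described would look roughly like this (the jar locations are assumptions from this thread). One caveat worth checking: HADOOP_CLASSPATH mainly affects the JVM that submits the job, so on a real cluster the worker nodes still need the jars in $HADOOP_HOME/lib (or shipped with the job), which could explain why hama-0.1.0-dev.jar is still not found at task time:

```shell
# $HADOOP_HOME/conf/hadoop-env.sh -- sketch; exact paths are assumptions.
export HADOOP_CLASSPATH="$HBASE_HOME/hbase-0.19.3.jar:$HAMA_HOME/hama-0.1.0-dev.jar:$HADOOP_CLASSPATH"
```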
> >
> > Thank You
> >
> > Abhishek Agrawal
> >
> > SUNY- Buffalo
> > (716-435-7122)
> >
> > On Fri 11/06/09 8:30 PM, aa...@buffalo.edu sent:
> > > Hi,
> > > I have been working for the last 2-3 days trying to solve the bugs that
> > > I had mentioned previously. I solved some of them. My job is no longer
> > > running locally and I am no longer getting the java heap space
> > > exception. But now I am getting another exception. The full stack trace
> > > is as under. Once again this exception does not occur when I run the
> > > matrix multiplication using the bin/hbase mult command. Appreciate any
> > > thoughts in this matter.
> > > 03/04/30 02:34:06 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> > > 03/04/30 02:34:07 INFO mapred.HTableInputFormatBase: split: 0->aphrodite:,
> > > 03/04/30 02:34:07 INFO mapred.JobClient: Running job: job_200304291805_0004
> > > 03/04/30 02:34:08 INFO mapred.JobClient:  map 0% reduce 0%
> > > 03/04/30 02:34:14 INFO mapred.JobClient: Task Id : attempt_200304291805_0004_m_000000_0, Status : FAILED
> > > java.io.IOException: Split class org.apache.hadoop.hbase.mapred.TableSplit not found
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:314)
> > >         at org.apache.hadoop.mapred.Child.main(Child.java:158)
> > > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.mapred.TableSplit
> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
> > >         at java.security.AccessController.doPrivileged(Native Method)
> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> > >         at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> > >         at java.lang.Class.forName0(Native Method)
> > >         at java.lang.Class.forName(Class.java:247)
> > >         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:673)
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:311)
> > >         ... 1 more
> > >
> > > Thank You
> > >
> > > Abhishek Agrawal
> > >
> > > SUNY- Buffalo
> > > (716-435-7122)
> > >
> > > On Wed 11/04/09 3:17 AM, "Edward J. Yoon" <[email protected]> sent:
> > > > > Caused by: java.lang.ClassNotFoundException:
> > > > > org.apache.commons.httpclient.HttpMethod
> > > >
> > > > Firstly, you have a ClassNotFoundException. Please add the
> > > > dependencies to your project. I would recommend adding all of
> > > > {$HAMA_HOME/lib}.
> > > >
> > > > > 03/04/27 08:02:57 INFO mapred.MapTask: io.sort.mb = 100
> > > > > 03/04/27 08:02:57 WARN mapred.LocalJobRunner: job_local_0001
> > > > > java.lang.OutOfMemoryError: Java heap space
> > > >
> > > > Regarding the OutOfMemoryError, it's hard to tell. Do you run the
> > > > program via eclipse?
> > > >
> > > > On Wed, Nov 4, 2009 at 3:59 PM, <aa...@buffalo.edu> wrote:
> > > > > Hey,
> ���>>
> ��� In my program I am>
> creating 2 matrices, putting values in them and>> then I> call the function>
> >>> > > DenseMatrix
> resultMatrix=>> >
> covMatrix.mult(retMatrix);>>> > > I am not setting the number of
> maps/>> reduces to> be done, the block thing.
> Is> there any>> problem because of that ?Also the full>
> error I am getting is as under.> I noticed>> these 2 sentences
> mapred.MapTask:>
> io.sort.mb = 100 and> mapred.LocalJobRunner:>> job_local_0001. Can you> tell
what is the
> significance of> these 2>> sentences ?> >
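On the two log lines being asked about: mapred.LocalJobRunner: job_local_0001 means the job is running inside the client JVM rather than on the cluster, and io.sort.mb = 100 is the in-memory sort buffer (in MB) that each map task allocates up front. When the job runs locally, that 100 MB buffer must fit in the client JVM's heap, which matches the OutOfMemoryError at MapTask$MapOutputBuffer in the trace below. A hedged sketch of a workaround (the class name is the one from this thread):

```shell
# Sketch: raise the client JVM heap well above io.sort.mb when the job
# runs via LocalJobRunner. 1024m is an arbitrary example value.
java -Xmx1024m Markowitz.MarkowitzMain
```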
> > > > >
> > > > > 03/04/27 08:02:56 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
> > > > > 03/04/27 08:02:56 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> > > > > 03/04/27 08:02:56 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
> > > > > 03/04/27 08:02:56 INFO mapred.HTableInputFormatBase: split: 0->aphrodite:,
> > > > > 03/04/27 08:02:56 INFO mapred.JobClient: Running job: job_local_0001
> > > > > 03/04/27 08:02:57 INFO mapred.HTableInputFormatBase: split: 0->aphrodite:,
> > > > > 03/04/27 08:02:57 INFO mapred.MapTask: numReduceTasks: 1
> > > > > 03/04/27 08:02:57 INFO mapred.MapTask: io.sort.mb = 100
> > > > > 03/04/27 08:02:57 WARN mapred.LocalJobRunner: job_local_0001
> > > > > java.lang.OutOfMemoryError: Java heap space
> > > > >         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:498)
> > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> > > > >         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:138)
> > > > > Exception in thread "Thread-14" java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
> > > > >         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:196)
> > > > > Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
> > > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
> > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
> > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> > > > >         at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> > > > >         ... 1 more
> > > > > Exception in thread "main" java.io.IOException: Job failed!
> > > > >         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1232)
> > > > >         at org.apache.hama.util.JobManager.execute(JobManager.java:52)
> > > > >         at org.apache.hama.DenseMatrix.mult(DenseMatrix.java:554)
> > > > >         at Markowitz.MarkowitzMain.main(MarkowitzMain.java:33)
> > > > > had...@zeus:~/HAMA$
> > > > >
> > > > > Thank You
> > > > >
> > > > > Abhishek Agrawal
> > > > >
> > > > > SUNY- Buffalo
> > > > > (716-435-7122)
> > > > >
> > > > > On Wed 11/04/09 12:19 AM, "Edward J. Yoon" <[email protected]> sent:
> > > > > > I don't see any error when I perform matrix multiplication of a
> > > > > > matrix with itself.
> > > > > >
> > > > > > How large is your matrix? (and, let me know your cluster
> > > > > > information)
> > > > > >
> > > > > > On Wed, Nov 4, 2009 at 11:40 AM, Edward J. Yoon <[email protected]> wrote:
> > > > > > > OK, that error appears *only when* you perform the matrix
> > > > > > > multiplication of a matrix with itself?
> > > > > > >
> > > > > > > I'll check it.
> > > > > > >
> > > > > > > On Wed, Nov 4, 2009 at 5:38 AM, <aa...@buffalo.edu> wrote:
> > > > > > > > Hi,
> ���
> ��� As I was> telling
> you yesterday>> I am getting an>> error when I call the> matrix>>
> multiplication function from>> my> program. The> >> exact error
> I am getting is>> as>>> under. The call to the
> function>>>> is>>> >> >>
> >>
> ���
> ��� >>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� >>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� DenseMatrix>
> resultMatrix=>>>> >
> covMatrix.mult(covMatrix);>>>>>> >> I am multiplying the matrix>
> covMatrix by>> itself just to check if>> the>>> configuration is done
> correctly. But then>> I>> am getting an exception. I tried
> to>>> run the default matrix multiplication>>>> program bin/hama examples mult
matA>
> matA>> but that worked perfectly well. SO>> I> think> >> its not that the
> heap is full>> rather> I>> have not done some
> setting>>>> properly.>>> >> >> had...@zeus:~
> >> > >> /HAMA$ java
> >> Markowitz.MarkowitzMain>>> In
> matrix Multiplication>> >>>> 03/04/26 20:29:05 INFO>
> jvm.JvmMetrics:>> Initializing JVM Metrics>> with>>>
> processName=JobTracker,>>>> sessionId=>> 03/04/26 20:29:05
> WARN> mapred.JobClient: Use>>>> GenericOptionsParser for parsing
> the>>> arguments. Applications should implement>>>> Tool for the same.>>
> 03/04/26
> 20:29:05> WARN mapred.JobClient: No>> job jar file>> set.
> ���
> ��� User> classes
> may>> not be found. See>> JobConf(Class) or>>
> JobConf#setJar(String).>>03/04/26>> > 20:29:05 INFO>>
> >> mapred.HTableInputFormatBase: split:>
> 0->aphrodite:,>> 03/04/26 20:29:05>> INFO> mapred.JobClient:> >>
> Running job: job_local_0001>>>> 03/04/26> 20:29:06
> INFO>>>> mapred.HTableInputFormatBase: split:>
> 0->aphrodite:,>> 03/04/26 20:29:06>> INFO> mapred.MapTask:> >>
> numReduceTasks: 1>> 03/04/26>> 20:29:06> INFO mapred.MapTask:>>
> io.sort.mb =>> 100>> 03/04/26 20:29:06>
> WARN>> mapred.LocalJobRunner:>> > job_local_0001>>
> >> java.lang.OutOfMemoryError: Java
> heap>> space>>>>
> ���
> ��� >>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.(MapTask.java:498)>>
> >> >
> >>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:138)>
> ;>> ;> ;>> Exception in
> thread>> >
> "Thread-14">>>> >
> >>
> java.lang.NoClassDefFoundError:>>>>>
> org/apache/commons/httpclient/HttpMethod>>>>
> ���
> ��� >>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:196)>
> ;>> ;> ;>> Caused by:
> >> >
> >>
> java.lang.ClassNotFoundException:>>>
> org.apache.commons.httpclient.HttpMethod>>>>
> ���
> ��� >>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> java.net.URLClassLoader$1.run(URLClassLoader.java:200)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>>>
> java.security.AccessController.doPrivileged(Native>>
> Method)>>>>
> ���
> ��� >>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> java.net.URLClassLoader.findClass(URLClassLoader.java:188)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> java.lang.ClassLoader.loadClass(ClassLoader.java:306)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> java.lang.ClassLoader.loadClass(ClassLoader.java:251)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� ... 1>> more>>> Exception in thread
> "main">>>> java.io.IOException: Job
> failed!>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1232)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hama.util.JobManager.execute(JobManager.java:52)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> org.apache.hama.DenseMatrix.mult(DenseMatrix.java:554)>>>>>
> ���
> ���>>
> ���
> ���>>
> ���
> >>>>
> ���
> ��� at>
> >>>> >
> >>
> Markowitz.MarkowitzMain.main(MarkowitzMain.java:32)>> had...@zeus:~
> >> > >> /HAMA$>>
> > > > > > > >
> > > > > > > > Thank You
> > > > > > > >
> > > > > > > > Abhishek Agrawal
> > > > > > > >
> > > > > > > > SUNY- Buffalo
> > > > > > > > (716-435-7122)
> > > > > > > >
> > > > > > > > On Tue 11/03/09 2:58 AM, "Edward J. Yoon" <[email protected]> sent:
> > > > > > > > > Hi,
> > > > > > > > >
> > > > > > > > > # The maximum amount of heap to use, in MB. Default is 1000.
> > > > > > > > > # export HBASE_HEAPSIZE=1000
> > > > > > > > >
> > > > > > > > > Check the above option. Regarding temporary tables, there is
> > > > > > > > > no method yet, but you can delete them using HbaseShell's
> > > > > > > > > disable and delete commands. If you have no important data
> > > > > > > > > in HDFS, re-formatting HDFS is a good idea. It's just a
> > > > > > > > > simple tip.
> > > > > > > > >
> > > > > > > > > On Tue, Nov 3, 2009 at 4:39 PM, <aa...@buffalo.edu> wrote:
> > > > > > > > > > Hi,
> > > > > > > > > > I have another doubt. I am running a program of mine which
> > > > > > > > > > results in the creation of quite a few tables in Hbase
> > > > > > > > > > (397 to be exact). Now when I am running a matrix-matrix
> > > > > > > > > > multiplication job I am getting a "heap size not enough"
> > > > > > > > > > exception. What should I do to delete these 397 tables
> > > > > > > > > > easily?
> > > > > > > > > >
> > > > > > > > > > Thank You
> > > > > > > > > >
> > > > > > > > > > Abhishek Agrawal
> > > > > > > > > >
> > > > > > > > > > SUNY- Buffalo
> > > > > > > > > > (716-435-7122)
> > > > > > > > > >
> > > > > > > > > > On Sat 10/31/09 3:25 AM, "Edward J. Yoon" <[email protected]> sent:
> > > > > > > > > > > Sounds good. Thank you, I'll update the wiki.
> > > > > > > > > > >
> > > > > > > > > > > (you can also change the wiki pages, please feel free to
> > > > > > > > > > > contribute!)
> > > > > > > > > > >
> > > > > > > > > > > On Sat, Oct 31, 2009 at 2:28 PM, <aa...@buffalo.edu> wrote:
> > > > > > > > > > > > Hey,
> > > > > > > > > > > > I just ran a test. I put a println statement and it
> > > > > > > > > > > > worked. Thanks a lot for your help. From my experience,
> > > > > > > > > > > > I feel the addition of the following statements to the
> > > > > > > > > > > > wiki page would make it easier for new users to
> > > > > > > > > > > > install / get started with hama.
> > > > > > > > > > > > 1. Absolute paths have to be given in the config file
> > > > > > > > > > > > 2. To see the O/P, changes have to be made to the
> > > > > > > > > > > > source code; the code has to be compiled again and the
> > > > > > > > > > > > necessary files copied to $HAMA_HOME
> > > > > > > > > > > >
> > > > > > > > > > > > Thank You
> > > > > > > > > > > >
> > > > > > > > > > > > Abhishek Agrawal
> > > > > > > > > > > >
> > > > > > > > > > > > SUNY- Buffalo
> > > > > > > > > > > > (716-435-7122)
> > > > > > > > > > >
> > > > > > > > > > > --
> > > > > > > > > > > Best Regards, Edward J. Yoon @ NHN, corp.
> > > > > > > > > > > [email protected]
> > > > > > > > > > > http://blog.udanax.org
> > > > > > > > >
> > > > > > > > > --
> > > > > > > > > Best Regards, Edward J. Yoon @ NHN, corp.
> > > > > > > > > [email protected]
> > > > > > > > > http://blog.udanax.org
> > > > > > >
> > > > > > > --
> > > > > > > Best Regards, Edward J. Yoon @ NHN, corp.
> > > > > > > [email protected]
> > > > > > > http://blog.udanax.org
> > > > > >
> > > > > > --
> > > > > > Best Regards, Edward J. Yoon @ NHN, corp.
> > > > > > [email protected]
> > > > > > http://blog.udanax.org
> > > >
> > > > --
> > > > Best Regards, Edward J. Yoon @ NHN, corp.
> > > > [email protected]
> > > > http://blog.udanax.org
>
> --
> Best Regards, Edward J. Yoon @ NHN, corp.
> [email protected]
> http://blog.udanax.org