Hi,
I think the paths that I have set up are correct. In fact, I went to the HBase
home directory, invoked the shell, and typed the list command. This is the
output that I got.

hbase(main):001:0> list

SparseMatrix_randjlmkz
SparseMatrix_randljaul
SparseMatrix_randltgca
SparseMatrix_randmabpy
SparseMatrix_randmpcgl
SparseMatrix_randozrcg
SparseMatrix_randslptv
SparseMatrix_randugfev
SparseMatrix_randusion
SparseMatrix_randxsdbg
SparseMatrix_randzrgjs
13 row(s) in 0.1992 seconds
hbase(main):002:0>

Similarly, I listed all the files in my HDFS home directory:

had...@zeus:~/hadoop-0.19.1$ bin/hadoop dfs -ls

Found 4 items
drwxr-xr-x   - hadoop supergroup          0 2003-04-21 17:45 /user/hadoop/SparseMatrix_TMP_dir
drwxr-xr-x   - hadoop supergroup          0 2003-04-21 17:28 /user/hadoop/etexts-output
drwxr-xr-x   - hadoop supergroup          0 2003-04-21 06:55 /user/hadoop/wordcount
drwxr-xr-x   - hadoop supergroup          0 2003-04-21 17:27 /user/hadoop/wordcount-1
had...@zeus:~/hadoop-0.19.1$

So this makes me feel that the paths I have set in hama-env.sh are correct.
What do you think?
Also, can you tell me where the log directory is created?
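
[Editor's note] One quick way to confirm that the hama-env.sh paths really exist is a sketch like the following; the three directories are the values quoted elsewhere in this thread, so substitute your own:

```shell
# Check that each directory configured in conf/hama-env.sh exists.
check() { [ -d "$1" ] && echo "ok: $1" || echo "MISSING: $1"; }

check /usr/lib/jvm/java-6-sun           # JAVA_HOME
check /home/hadoop/hadoop-0.19.1/conf   # HADOOP_CONF_DIR
check /home/hadoop/hbase-0.19.3/conf    # HBASE_CONF_DIR
```

Any "MISSING" line points at a path that Hama will fail to resolve at startup.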

Abhishek Agrawal

SUNY- Buffalo
(716-435-7122)

On Fri 10/30/09 12:43 AM, "Edward J. Yoon" [email protected] sent:
> > Should I copy the hadoop, hbase folders to the location /usr/local/src
> 
> nope, that's just an example. You'll need to set up the paths of your
> hadoop/hbase/jdk.
> 
> On Fri, Oct 30, 2009 at 1:19 PM, <aa...@buffalo.edu> wrote:
> > Hi,
> >    I am sorry, but my /usr/local/src/ folder is empty. I have stored the
> > folders for Hadoop and HBase in the ~/ directory. Consequently my
> > hama-env.sh file looks like this.
> >
> > # The java implementation to use.  Required.
> > export JAVA_HOME=/usr/lib/jvm/java-6-sun
> >
> > # hadoop conf dir. to find the mapreduce cluster.
> > export HADOOP_CONF_DIR=/home/hadoop/hadoop-0.19.1/conf
> >
> > # hbase conf dir. to find the hbase cluster.
> > export HBASE_CONF_DIR=/home/hadoop/hbase-0.19.3/conf
> >
> > # Extra Java CLASSPATH elements.  Optional.
> > export HAMA_CLASSPATH=$HADOOP_CONF_DIR:$HBASE_CONF_DIR
> >
> > # The maximum amount of heap to use, in MB. Default is 1000.
> > export HBASE_HEAPSIZE=1000
> >
> > The only reason for making HADOOP_CONF_DIR point to
> > /usr/local/src/hadoop-0.19.0/conf was that your hadoop folder is at
> > /usr/local/src. Should I copy the hadoop and hbase folders to
> > /usr/local/src?
> >
> > Thank You
> >
> > Abhishek Agrawal
> >
> > SUNY- Buffalo
> > (716-435-7122)
> >
> > On Thu 10/29/09 9:58 PM, "Edward J. Yoon" [email protected] sent:
> > > > had...@zeus:~/HAMA$ echo $CLASSPATH
> > > > /home/hadoop/hadoop-0.19.1/hadoop-0.19.1-core.jar:/home/hadoop/hbase-0.19.3/hbase-0.19.3.jar:/home/hadoop/hbase-0.19.3/lib/commons-logging-1.0.4.jar:/home/hadoop/hbase-0.19.3/lib/log4j-1.2.13:/home/hadoop/weka.jar:/home/hadoop/HAMA/hama-0.1.0-dev.jar:/home/hadoop/HAMA/hama-0.1.0-dev-examples.jar:.
> > >
> > > You don't need to directly set up the classpath. Instead, please change
> > > the conf/hama-env.sh as described below.
> > >
> > > ----
> > > # Set environment variables here.
> > >
> > > # The java implementation to use.  Required.
> > > export JAVA_HOME=/usr/lib/j2sdk1.5-sun
> > >
> > > # hadoop conf dir. to find the mapreduce cluster.
> > > export HADOOP_CONF_DIR=/usr/local/src/hadoop-0.19.0/conf
> > >
> > > # hbase conf dir. to find the hbase cluster.
> > > export HBASE_CONF_DIR=/usr/local/src/hbase-0.19.0/conf
> > >
> > > # Extra Java CLASSPATH elements.  Optional.
> > > export HAMA_CLASSPATH=$HADOOP_CONF_DIR:$HBASE_CONF_DIR
> > >
> > > # The maximum amount of heap to use, in MB. Default is 1000.
> > > export HBASE_HEAPSIZE=1000
> > >
> > > Thanks.
> > > On Fri, Oct 30, 2009 at 5:00 AM, <aa...@buffalo.edu> wrote:
> > > > Hi all,
> > > >    Okay, I made the changes suggested by you. Now another problem has
> > > > come up. I execute the command bin/hama examples rand -m 10 -r 10 100
> > > > 100 35% matA and I get the following output:
> > > >
> > > > had...@zeus:~/HAMA$ bin/hama examples rand -m 10 -r 10 100 100 35% matA
> > > > 03/04/21 19:51:51 INFO hama.AbstractMatrix: Initializing the matrix storage.
> > > > 03/04/21 19:51:57 INFO hama.AbstractMatrix: Create Matrix SparseMatrix_randjlmkz
> > > > 03/04/21 19:51:57 INFO hama.AbstractMatrix: Create the 100 * 100 random matrix : SparseMatrix_randjlmkz
> > > > Wrote input for Map #0
> > > > Wrote input for Map #1
> > > > Wrote input for Map #2
> > > > Wrote input for Map #3
> > > > Wrote input for Map #4
> > > > Wrote input for Map #5
> > > > Wrote input for Map #6
> > > > Wrote input for Map #7
> > > > Wrote input for Map #8
> > > > Wrote input for Map #9
> > > > 03/04/21 19:51:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> > > > 03/04/21 19:51:58 WARN mapred.JobClient: Use genericOptions for the option -libjars
> > > > java.io.FileNotFoundException: File ~/HAMA/hama-0.1.0-dev.jar does not exist.
> > > >         at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:420)
> > > >         at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:244)
> > > >         at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> > > >         at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1187)
> > > >         at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1163)
> > > >         at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1135)
> > > >         at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:693)
> > > >         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:788)
> > > >         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1142)
> > > >         at org.apache.hama.SparseMatrix.random_mapred(SparseMatrix.java:160)
> > > >         at org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)
> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > >         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> > > >         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
> > > >         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)
> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:165)
> > > >         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> > > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> > > >         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> > > >
> > > > This is a snapshot of the ~/HAMA folder:
> > > >
> > > > had...@zeus:~/HAMA$ ls
> > > > bin  build  build.xml  conf  hama-0.1.0-dev-examples.jar  hama-0.1.0-dev.jar  hama-0.1.0-dev-test.jar  lib  src  src-gen
> > > >
> > > > This is the classpath that I have set up:
> > > >
> > > > had...@zeus:~/HAMA$ echo $CLASSPATH
> > > > /home/hadoop/hadoop-0.19.1/hadoop-0.19.1-core.jar:/home/hadoop/hbase-0.19.3/hbase-0.19.3.jar:/home/hadoop/hbase-0.19.3/lib/commons-logging-1.0.4.jar:/home/hadoop/hbase-0.19.3/lib/log4j-1.2.13:/home/hadoop/weka.jar:/home/hadoop/HAMA/hama-0.1.0-dev.jar:/home/hadoop/HAMA/hama-0.1.0-dev-examples.jar:.
> > > >
> > > > Now I cannot understand why it cannot find ~/HAMA/hama-0.1.0-dev.jar.
> > > > It is in the classpath as well as in the concerned folder.
> > > >
> > > > Thank You
> > > >
> > > > Abhishek Agrawal
> > > >
> > > > SUNY- Buffalo
> > > > (716-435-7122)
> > > >
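
[Editor's note] A possible cause of the java.io.FileNotFoundException above (my conjecture, not something stated in this thread): "~" is expanded by the shell, never by the JVM, so a literal "~/HAMA/hama-0.1.0-dev.jar" passed as a Java-side path does not resolve to the home directory. A minimal demonstration with a throwaway file:

```shell
# "~" is expanded by the shell, not by the JVM: a quoted/literal "~/..."
# string tested as a path does not resolve to $HOME/... .
dir=$(mktemp -d)
touch "$dir/demo.jar"

[ -e "$dir/demo.jar" ] && echo "absolute path: found"    # prints "absolute path: found"
[ -e '~/demo.jar' ] || echo "literal ~ path: not found"  # prints "literal ~ path: not found"
```

Using the absolute path /home/hadoop/HAMA/hama-0.1.0-dev.jar wherever the jar is referenced would sidestep this.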
> > > > On Wed 10/28/09 9:06 PM, "Edward J. Yoon" [email protected] sent:
> > > > > > Ok..I shall obtain the source code from this link
> > > > > > http://svn.apache.org/viewvc/incubator/hama/branches/. This is the
> > > > > > correct link right ?
> > > > >
> > > > > Yes.
> > > > >
> > > > > > http://wiki.apache.org/hama/GettingStarted. Probably a change
> > > > > > should be made there.
> > > > >
> > > > > Thanks, I'll update that page.
> > > > > On Thu, Oct 29, 2009 at 10:02 AM, <aa...@buffalo.edu> wrote:
> > > > > > Hi,
> > > > > >    Ok..I shall obtain the source code from this link
> > > > > > http://svn.apache.org/viewvc/incubator/hama/branches/. This is the
> > > > > > correct link right ? I read the HAMA getting started guide on the
> > > > > > wiki. http://wiki.apache.org/hama/GettingStarted. Probably a change
> > > > > > should be made there.
> > > > > >
> > > > > > Thank You
> > > > > >
> > > > > > Abhishek Agrawal
> > > > > >
> > > > > > SUNY- Buffalo
> > > > > > (716-435-7122)
> > > > > > On Wed 10/28/09 8:44 PM, "Edward J. Yoon" [email protected] sent:
> > > > > > > Hi,
> > > > > > >
> > > > > > > you should use the hama-0.19 branch instead of trunk, or update
> > > > > > > hadoop & hbase to the 0.20.x version.
> > > > > > > On Thu, Oct 29, 2009 at 8:05 AM, <aa...@buffalo.edu> wrote:
> > > > > > > > Hello,
> > > > > > > >    The version of Hadoop used by me is hadoop-0.19.1 and the
> > > > > > > > version of HBase used by me is hbase-0.19.3. I have put the
> > > > > > > > files hadoop-0.19.1-core.jar and hbase-0.19.3.jar in the lib
> > > > > > > > folder in hama.
> > > > > > > >
> > > > > > > > I am getting the following exception when I try to create a
> > > > > > > > random matrix. The exception is first caught in
> > > > > > > > HColumnDescriptor.java. I am attaching that file for your
> > > > > > > > convenience. As you can see, on line 197 the HColumnDescriptor
> > > > > > > > constructor is called. On line 201 this constructor calls a
> > > > > > > > method isLegalFamilyName(familyName). If the name does not end
> > > > > > > > with : an exception is being thrown.
> > > > > > > >
> > > > > > > > Also where are the log files stored ?
> > > > > > > >
> > > > > > > > had...@zeus:~/HAMA$ bin/hama examples rand -m 10 -r 10 2000 2000 30.5% matrixA
> > > > > > > > java.lang.IllegalArgumentException: Family names must end in a colon: path
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.isLegalFamilyName(HColumnDescriptor.java:236)
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:201)
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:157)
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:146)
> > > > > > > >         at org.apache.hama.HamaAdminImpl.initialJob(HamaAdminImpl.java:80)
> > > > > > > >         at org.apache.hama.HamaAdminImpl.<init>(HamaAdminImpl.java:70)
> > > > > > > >         at org.apache.hama.matrix.AbstractMatrix.setConfiguration(AbstractMatrix.java:105)
> > > > > > > >         at org.apache.hama.matrix.SparseMatrix.<init>(SparseMatrix.java:60)
> > > > > > > >         at org.apache.hama.matrix.SparseMatrix.random_mapred(SparseMatrix.java:120)
> > > > > > > >         at org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)
> > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > > > > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > > > > > >         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> > > > > > > >         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
> > > > > > > >         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)
> > > > > > > > java.lang.IllegalArgumentException: Family names must end in a colon: column
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.isLegalFamilyName(HColumnDescriptor.java:236)
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:201)
> > > > > > > >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:157)
> > > > > > > >         at org.apache.hama.matrix.AbstractMatrix.create(AbstractMatrix.java:144)
> > > > > > > >         at org.apache.hama.matrix.AbstractMatrix.tryToCreateTable(AbstractMatrix.java:122)
> > > > > > > >         at org.apache.hama.matrix.SparseMatrix.<init>(SparseMatrix.java:62)
> > > > > > > >         at org.apache.hama.matrix.SparseMatrix.random_mapred(SparseMatrix.java:120)
> > > > > > > >         at org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)
> > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > > > > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > > > > > >         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> > > > > > > >         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
> > > > > > > >         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)
> > > > > > > >
> > > > > > > > Abhishek Agrawal
> > > > > > > >
> > > > > > > > SUNY- Buffalo
> > > > > > > > (716-435-7122)
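
[Editor's note] The isLegalFamilyName check described in the quoted message boils down to "the family name must end in ':'" in hbase-0.19. A shell paraphrase of that rule (not the actual HBase code) reproduces the two messages seen above:

```shell
# Paraphrase of the hbase-0.19 family-name rule described above
# (not the real HColumnDescriptor code): names must end in a colon.
for name in path column: ; do
  case "$name" in
    *:) echo "legal family name: $name" ;;
    *)  echo "Family names must end in a colon: $name" ;;
  esac
done
# prints:
#   Family names must end in a colon: path
#   legal family name: column:
```

This is why "path" and "column" are rejected: the Hama 0.1.0-dev jar in use passes family names without the trailing colon that the 0.19-series HBase API expects.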
> 
> -- 
> Best Regards, Edward J. Yoon @ NHN, corp.
> [email protected]
> http://blog.udanax.org