> Should I copy the hadoop, hbase folders to the location /usr/local/src?

Nope, that's just an example. You'll need to set the paths to your own
hadoop/hbase/jdk installations.
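For instance, a quick sanity check before editing conf/hama-env.sh (the two paths at the bottom are only examples; substitute wherever you unpacked the tarballs):

```shell
# check_conf_dir: report whether a conf directory exists, so you can verify
# the paths before writing them into conf/hama-env.sh.
check_conf_dir() {
  if [ -d "$1" ]; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

# Example locations only; adjust to your own install paths.
check_conf_dir /home/hadoop/hadoop-0.19.1/conf
check_conf_dir /home/hadoop/hbase-0.19.3/conf
```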

On Fri, Oct 30, 2009 at 1:19 PM,  <[email protected]> wrote:
> Hi,
> I am sorry but my /usr/local/src/ folder is empty. I have stored the folders
> for Hadoop and HBase in the ~/ directory. Consequently my conf/hama-env.sh
> file looks like this.
>
>
> # The java implementation to use.  Required.
> export JAVA_HOME=/usr/lib/jvm/java-6-sun
>
> # hadoop conf dir. to find the mapreduce cluster.
> export HADOOP_CONF_DIR=/home/hadoop/hadoop-0.19.1/conf
>
> # hbase conf dir. to find the hbase cluster.
> export HBASE_CONF_DIR=/home/hadoop/hbase-0.19.3/conf
>
> # Extra Java CLASSPATH elements.  Optional.
> export HAMA_CLASSPATH=$HADOOP_CONF_DIR:$HBASE_CONF_DIR
>
> # The maximum amount of heap to use, in MB. Default is 1000.
> export HBASE_HEAPSIZE=1000
>
> There is no reason for making the variable HADOOP_CONF_DIR point to
> /usr/local/src/hadoop-0.19.0/conf other than the fact that your hadoop folder
> is at the location /usr/local/src, right?
> Should I copy the hadoop, hbase folders to the location /usr/local/src?
>
> Thank You
>
> Abhishek Agrawal
>
> SUNY- Buffalo
> (716-435-7122)
>
> On Thu 10/29/09  9:58 PM , "Edward J. Yoon" [email protected] sent:
>> > had...@zeus:~/HAMA$ echo $CLASSPATH
>> > /home/hadoop/hadoop-0.19.1/hadoop-0.19.1-core.jar:/home/hadoop/hbase-0.19.3/hbase-0.19.3.jar:/home/hadoop/hbase-0.19.3/lib/commons-logging-1.0.4.jar:/home/hadoop/hbase-0.19.3/lib/log4j-1.2.13:/home/hadoop/weka.jar:/home/hadoop/HAMA/hama-0.1.0-dev.jar:/home/hadoop/HAMA/hama-0.1.0-dev-examples.jar:.
>>
>> You don't need to set the classpath directly. Instead, please change
>> conf/hama-env.sh as described below.
>>
>> ----
>> # Set environment variables here.
>>
>> # The java implementation to use.  Required.
>> export JAVA_HOME=/usr/lib/j2sdk1.5-sun
>>
>> # hadoop conf dir. to find the mapreduce cluster.
>> export HADOOP_CONF_DIR=/usr/local/src/hadoop-0.19.0/conf
>>
>> # hbase conf dir. to find the hbase cluster.
>> export HBASE_CONF_DIR=/usr/local/src/hbase-0.19.0/conf
>>
>> # Extra Java CLASSPATH elements.  Optional.
>> export HAMA_CLASSPATH=$HADOOP_CONF_DIR:$HBASE_CONF_DIR
>>
>> # The maximum amount of heap to use, in MB. Default is 1000.
>> export HBASE_HEAPSIZE=1000
>>
>> Thanks.
>>
>> On Fri, Oct 30, 2009 at 5:00 AM,  <aa...@buffa
>> lo.edu> wrote:> Hi all,
>> >       Okay I
>> made the changes suggested by you. Now another problem has come
>> up.> I execute the command
>> > bin/hama examples rand -m 10 -r 10 100 100 35%
>> matA and I get the following O/p>
>> > had...@zeus:~
>> /HAMA$ bin/hama examples rand -m 10 -r 10 100 100 35% matA>
>> > 03/04/21 19:51:51 INFO hama.AbstractMatrix:
>> Initializing the matrix storage.> 03/04/21 19:51:57 INFO hama.AbstractMatrix:
>> Create Matrix SparseMatrix_randjlmkz> 03/04/21 19:51:57 INFO 
>> hama.AbstractMatrix:
>> Create the 100 * 100 random matrix :> SparseMatrix_randjlmkz
>> > Wrote input for Map #0
>> > Wrote input for Map #1
>> > Wrote input for Map #2
>> > Wrote input for Map #3
>> > Wrote input for Map #4
>> > Wrote input for Map #5
>> > Wrote input for Map #6
>> > Wrote input for Map #7
>> > Wrote input for Map #8
>> > Wrote input for Map #9
>> > 03/04/21 19:51:58 WARN mapred.JobClient: Use
>> GenericOptionsParser for parsing the> arguments. Applications should 
>> implement Tool
>> for the same.> 03/04/21 19:51:58 WARN mapred.JobClient: Use
>> genericOptions for the option -libjars> java.io.FileNotFoundException: File
>> ~/HAMA/hama-0.1.0-dev.jar does not exist.>     Â
>>  at>
>> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.ja
>> va:420)>     Â
>>  at>
>> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:2
>> 44)>     Â
>>  at
>> org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)>     Â
>>  at
>> org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1187)>   
>>   Â
>>  at
>> org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1163)>   
>>   Â
>>  at
>> org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1135)>   
>>   Â
>>  at>
>> org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.ja
>> va:693)>     Â
>>  at
>> org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:788)>     Â
>>  at
>> org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1142)>     Â
>>  at
>> org.apache.hama.SparseMatrix.random_mapred(SparseMatrix.java:160)>     Â
>>  at
>> org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)>     Â
>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> Method)>     Â
>>  at>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:3
>> 9)>     Â
>>  at>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImp
>> l.java:25)>     Â
>>  at java.lang.reflect.Method.invoke(Method.java:597)>     Â
>>  at>
>> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDrive
>> r.java:68)>     Â
>>  at
>> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)>     Â
>>  at
>> org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)>     Â
>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> Method)>     Â
>>  at>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:3
>> 9)>     Â
>>  at>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImp
>> l.java:25)>     Â
>>  at java.lang.reflect.Method.invoke(Method.java:597)>     Â
>>  at
>> org.apache.hadoop.util.RunJar.main(RunJar.java:165)>     Â
>>  at
>> org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)>     Â
>>  at
>> org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)>     Â
>>  at
>> org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)>     Â
>>  at
>> org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)>
>> > This is a snapshot of the ~/HAMA folder:
>> >
>> > had...@zeus:~/HAMA$ ls
>> > bin  build  build.xml  conf  hama-0.1.0-dev-examples.jar
>> > hama-0.1.0-dev.jar  hama-0.1.0-dev-test.jar  lib  src  src-gen
>> >
>> > This is the classpath that I have set up:
>> >
>> > had...@zeus:~/HAMA$ echo $CLASSPATH
>> > /home/hadoop/hadoop-0.19.1/hadoop-0.19.1-core.jar:/home/hadoop/hbase-0.19.3/hbase-0.19.3.jar:/home/hadoop/hbase-0.19.3/lib/commons-logging-1.0.4.jar:/home/hadoop/hbase-0.19.3/lib/log4j-1.2.13:/home/hadoop/weka.jar:/home/hadoop/HAMA/hama-0.1.0-dev.jar:/home/hadoop/HAMA/hama-0.1.0-dev-examples.jar:.
>> >
>> > Now I cannot understand the reason why it cannot find
>> > ~/HAMA/hama-0.1.0-dev.jar. It is in the classpath as well as in the
>> > concerned folder.
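A likely cause (an inference, not something the trace states): "~" is expanded by the shell, but not by the JVM, so java.io.File treats "~/HAMA/hama-0.1.0-dev.jar" as a literal relative path and the copy step cannot find it even though the shell can. Using the absolute path (e.g. /home/hadoop/HAMA/hama-0.1.0-dev.jar) should avoid the FileNotFoundException. A small sketch:

```java
import java.io.File;

public class TildePathDemo {
    public static void main(String[] args) {
        // The JVM does not expand "~": this path is relative to the current
        // directory and names a literal directory called "~".
        File literal = new File("~/HAMA/hama-0.1.0-dev.jar");

        // Build the absolute path from the user.home property instead.
        File expanded = new File(System.getProperty("user.home"),
                                 "HAMA/hama-0.1.0-dev.jar");

        System.out.println(literal.getPath());   // still starts with "~"
        System.out.println(expanded.getPath());  // an absolute path
    }
}
```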
>> > Thank You
>> >
>> > Abhishek Agrawal
>> >
>> > SUNY- Buffalo
>> > (716-435-7122)
>> >
>> > On Wed 10/28/09  9:06 PM , "Edward J. Yoon" [email protected] sent:
>> >> >   Ok..I shall obtain the source code from this link
>> >> > http://svn.apache.org/viewvc/incubator/hama/branches/. This is the
>> >> > correct link right?
>> >>
>> >> Yes.
>> >>
>> >> > http://wiki.apache.org/hama/GettingStarted. Probably a change should
>> >> > be made there.
>> >>
>> >> Thanks, I'll update that page.
>> >>
>> >> On Thu, Oct 29, 2009 at 10:02 AM,  <aa...@buffalo.edu> wrote:
>> >> > Hi,
>> >> >   Ok..I shall obtain the source code from this link
>> >> > http://svn.apache.org/viewvc/incubator/hama/branches/. This is the
>> >> > correct link right? I read the HAMA getting started guide on the wiki,
>> >> > http://wiki.apache.org/hama/GettingStarted. Probably a change should
>> >> > be made there.
>> >> >
>> >> > Thank You
>> >> >
>> >> > Abhishek Agrawal
>> >> >
>> >> > SUNY- Buffalo
>> >> > (716-435-7122)
>> >> >
>> >> > On Wed 10/28/09  8:44 PM , "Edward J. Yoon" [email protected] sent:
>> >> >> Hi.
>> >> >>
>> >> >> you should use the hama-0.19 branch instead of trunk. or update
>> >> >> hadoop & hbase to 0.20.x version.
>> >> >>
>> >> >> On Thu, Oct 29, 2009 at 8:05 AM,  <aa...@buffalo.edu> wrote:
>> >> >> > Hello,
>> >> >> >       The version of Hadoop used by me is hadoop-0.19.1 and the
>> >> >> > version of hbase used by me is hbase-0.19.3. I have put the files
>> >> >> > hadoop-0.19.1-core.jar and hbase-0.19.3.jar in the lib folder in
>> >> >> > hama.
>> >> >> > I am getting the following exception when I try to create a random
>> >> >> > matrix. The exception is first caught in HColumnDescriptor.java. I
>> >> >> > am attaching that file for your convenience. As you can see on line
>> >> >> > 197 the HColumnDescriptor constructor is called. On line 201 this
>> >> >> > constructor calls the method isLegalFamilyName(familyName). If the
>> >> >> > name does not end with : an exception is being thrown.
>> >> >> >
>> >> >> > Also where are the log files stored?
>> >> >> >
>> >> >> > had...@zeus:~/HAMA$ bin/hama examples rand -m 10 -r 10 2000 2000 30.5% matrixA
>> >> >> > java.lang.IllegalArgumentException: Family names must end in a colon: path
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.isLegalFamilyName(HColumnDescriptor.java:236)
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:201)
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:157)
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:146)
>> >> >> >         at org.apache.hama.HamaAdminImpl.initialJob(HamaAdminImpl.java:80)
>> >> >> >         at org.apache.hama.HamaAdminImpl.<init>(HamaAdminImpl.java:70)
>> >> >> >         at org.apache.hama.matrix.AbstractMatrix.setConfiguration(AbstractMatrix.java:105)
>> >> >> >         at org.apache.hama.matrix.SparseMatrix.<init>(SparseMatrix.java:60)
>> >> >> >         at org.apache.hama.matrix.SparseMatrix.random_mapred(SparseMatrix.java:120)
>> >> >> >         at org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)
>> >> >> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> >> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >> >         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >> >> >         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
>> >> >> >         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)
>> >> >> > java.lang.IllegalArgumentException: Family names must end in a colon: column
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.isLegalFamilyName(HColumnDescriptor.java:236)
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:201)
>> >> >> >         at org.apache.hadoop.hbase.HColumnDescriptor.<init>(HColumnDescriptor.java:157)
>> >> >> >         at org.apache.hama.matrix.AbstractMatrix.create(AbstractMatrix.java:144)
>> >> >> >         at org.apache.hama.matrix.AbstractMatrix.tryToCreateTable(AbstractMatrix.java:122)
>> >> >> >         at org.apache.hama.matrix.SparseMatrix.<init>(SparseMatrix.java:62)
>> >> >> >         at org.apache.hama.matrix.SparseMatrix.random_mapred(SparseMatrix.java:120)
>> >> >> >         at org.apache.hama.examples.RandomMatrix.main(RandomMatrix.java:49)
>> >> >> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> >> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >> >         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >> >> >         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
>> >> >> >         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:34)
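For reference, the rule being violated can be sketched as below. This is a reconstruction from the exception message, not the actual HBase 0.19 source: in HBase 0.19.x a column family name must carry a trailing ':', while the code here passes bare names like "path" and "column", hence the two failures above.

```java
public class FamilyNameCheck {
    // Reconstruction (from the exception message) of the family-name rule
    // enforced by HColumnDescriptor.isLegalFamilyName in HBase 0.19.x.
    static void isLegalFamilyName(String name) {
        if (!name.endsWith(":")) {
            throw new IllegalArgumentException(
                "Family names must end in a colon: " + name);
        }
    }

    public static void main(String[] args) {
        isLegalFamilyName("column:");  // accepted: has the trailing colon
        isLegalFamilyName("column");   // throws, as in the trace above
    }
}
```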
>> >> >> >
>> >> >> > Abhishek Agrawal
>> >> >> >
>> >> >> > SUNY- Buffalo
>> >> >> > (716-435-7122)
>> >> >> >
>> >> >>
>> >> >>
>> >> >>
>> >> >> --
>> >> >> Best Regards, Edward J. Yoon @ NHN, corp.
>> >> >> [email protected]
>> >> >> http://blog.udanax.org
>> >> >>
>> >> >
>> >> >
>> >>
>> >>
>> >>
>> >> --
>> >> Best Regards, Edward J. Yoon @ NHN, corp.
>> >> [email protected]
>> >> http://blog.udanax.org
>> >>
>> >>
>> >
>> >
>>
>>
>>
>> --
>> Best Regards, Edward J. Yoon @ NHN, corp.
>> [email protected]
>> http://blog.udanax.org
>>
>>
>>
>
>



-- 
Best Regards, Edward J. Yoon @ NHN, corp.
[email protected]
http://blog.udanax.org
