Maybe you've missed setting up the classpath properly.
Check that your classpath includes the jars that actually provide those symbols.
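One common cause with the split hadoop-common/hdfs/mapreduce trees is that hadoop-mapreduce resolves hadoop-hdfs (including its test jar) from the local Ivy cache, so a stale cached jar won't contain fields recently added to TestHDFSServerPorts. A rough sketch of how to check, assuming the default ~/.ivy2 cache location (adjust the paths to your setup):

```shell
# Sketch, assuming Hadoop's ant/ivy build and the default ~/.ivy2 cache.
# A stale cached hadoop-hdfs test jar would explain "cannot find symbol"
# for fields like NAME_NODE_HOST that exist in your hdfs source tree.

# 1. See which hadoop-hdfs jars the mapreduce build is actually resolving:
ls -l ~/.ivy2/cache/org.apache.hadoop/hadoop-hdfs*/jars/ 2>/dev/null \
  || echo "no cached hadoop-hdfs jars"

# 2. If those jars predate your hadoop-hdfs checkout, republish and rebuild
#    (same targets as in the build instructions you followed):
#    (cd hadoop-hdfs && ant mvn-install)
#    (cd hadoop-mapreduce && ant clean mvn-install)
```

If the timestamps on the cached jars are older than your last hadoop-hdfs build, republishing hdfs first and then rebuilding mapreduce should pick up the fresh jar.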
Junyoung Kim ([email protected])
On 01/22/2011 02:08 AM, Edson Ramiro wrote:
Hi all,
I'm compiling hadoop from git using these instructions [1].
hadoop-common and hadoop-hdfs are okay, they compile without errors, but
when I run ant mvn-install to build hadoop-mapreduce I get this error:
compile-mapred-test:
[javac] /home/lbd/hadoop/hadoop-ramiro/hadoop-mapreduce/build.xml:602:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 179 source files to
/home/lbd/hadoop/hadoop-ramiro/hadoop-mapreduce/build/test/mapred/classes
[javac]
/home/lbd/hadoop/hadoop-ramiro/hadoop-mapreduce/src/test/mapred/org/apache/hadoop/mapred/TestMRServerPorts.java:84:
cannot find symbol
[javac] symbol : variable NAME_NODE_HOST
[javac] TestHDFSServerPorts.NAME_NODE_HOST + "0");
[javac] ^
[javac]
/home/lbd/hadoop/hadoop-ramiro/hadoop-mapreduce/src/test/mapred/org/apache/hadoop/mapred/TestMRServerPorts.java:86:
cannot find symbol
[javac] symbol : variable NAME_NODE_HTTP_HOST
[javac] location: class org.apache.hadoop.hdfs.TestHDFSServerPorts
[javac] TestHDFSServerPorts.NAME_NODE_HTTP_HOST + "0");
[javac] ^
...
Is that a bug?
This is my build.properties:
#this is essential
resolvers=internal
#you can increment this number as you see fit
version=0.22.0-alpha-1
project.version=${version}
hadoop.version=${version}
hadoop-core.version=${version}
hadoop-hdfs.version=${version}
hadoop-mapred.version=${version}
Another question: is 0.22.0-alpha-1 the latest version?
Thanks in advance,
[1] https://github.com/apache/hadoop-mapreduce
--
Edson Ramiro Lucas Filho
{skype, twitter, gtalk}: erlfilho
http://www.inf.ufpr.br/erlf07/