OK, I did the following directly on Mac OS X Mavericks 10.9.2 (i.e. without VMware):
- I svn checked out hadoop-trunk
- installed Xcode, then installed Protocol Buffers
- from hadoop-trunk I invoked [mvn clean install]
- got the following:

...
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.178 sec - in org.apache.hadoop.util.TestZKUtil

Results :

Failed tests:
  TestZKFailoverController.testGracefulFailoverFailBecomingActive:484 Did not fail to graceful failover when target failed to become active!
  TestZKFailoverController.testGracefulFailoverFailBecomingStandby:518 expected:<1> but was:<0>
  TestZKFailoverController.testGracefulFailoverFailBecomingStandbyAndFailFence:540 Failover should have failed when old node wont fence
  TestNetUtils.testNormalizeHostName:619 expected:<[81.200.64.50]> but was:<[UnknownHost123]>

Tests in error:
  TestZKFailoverController.testGracefulFailover:444->Object.wait:-2 » test time...

Tests run: 2339, Failures: 4, Errors: 1, Skipped: 106

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [8.624s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.589s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [7.138s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.876s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.362s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [13.112s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2:14.364s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [1:39.711s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.697s]
[INFO] Apache Hadoop Common .............................. FAILURE [19:25.336s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
...
...
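[Editorial note: the TestNetUtils.testNormalizeHostName failure above comes down to a hostname lookup returning something other than the test expects on this machine, which depends on the local resolver (VPN entries, ISP DNS hijacking of unknown names, etc.) rather than on the Hadoop code. A rough Python sketch of the same kind of forward lookup, not the actual Java test:]

```python
import socket

# The failing test normalizes host names by resolving them, so its outcome
# depends on the local DNS setup. Loopback, at least, should always resolve
# the same way on a healthy machine:
print(socket.gethostbyname("localhost"))  # expected: 127.0.0.1
```

If `socket.gethostbyname(socket.gethostname())` raises or returns an unexpected address on your machine, failures like the ones above are likely environmental rather than code bugs.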
...

Steve Loughran mentioned something about hosts file entries. I'm only using one machine, and this is my hosts file content:

##
# Host Database
#
# localhost is used to configure the loopback interface
# when the system is booting. Do not change this entry.
##
127.0.0.1       localhost
255.255.255.255 broadcasthost
::1             localhost
fe80::1%lo0     localhost
# BEGIN section for OpenVPN Client SSL sites
127.94.0.1 client.openvpn.net
127.94.0.3 openvpn-client.mtl-he-lab2.exitcertified.com
127.94.0.2 openvpn-client.mtl-he-lab3.exitcertified.com
# END section for OpenVPN Client SSL sites

Do I need to create an entry for my public IP address? Any comments will be appreciated, thanks.

On 17 March 2014 14:39, Omar@Gmail <omarnet...@googlemail.com> wrote:

> Thanks, will check that.
>
>
> On 17 March 2014 13:29, Steve Loughran <ste...@hortonworks.com> wrote:
>
>> Sounds like your network is not consistent with Hadoop's expectations. VMs
>> are particularly fun here, while Ubuntu's attempts to hide the truth hook
>> your host up to 127.0.1.1 if you are not careful.
>>
>> Make sure that there are hosts entries for all the machines so that they
>> know their own names, it matches their IP addresses, etc.
>>
>>
>> On 16 March 2014 00:49, Omar@Gmail <omarnet...@googlemail.com> wrote:
>>
>> > I'm following instructions from
>> > http://wiki.apache.org/hadoop/HowToContribute
>> >
>> > I've checked out the hadoop project using:
>> >
>> > svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-trunk
>> >
>> > When trying to build from the root (i.e. hadoop-trunk), I get the errors
>> > I mentioned before.
>> >
>> > Also, I'm using Mac OS X and setting up the Hadoop development
>> > environment on Ubuntu running on VMware.
>> >
>> > Are there any recommendations I can get from you regarding this?
>> >
>> > Should I expect to get a successful build?
>> >
>> >
>> >
>> > On 15 March 2014 20:45, Ted Yu <yuzhih...@gmail.com> wrote:
>> >
>> > > From https://builds.apache.org/job/Hadoop-trunk/694/console :
>> > >
>> > > Build timed out (after 200 minutes). Marking the build as aborted.
>> > > Build was aborted
>> > >
>> > > The above shows how long building all sub-projects of Hadoop might take.
>> > >
>> > > -------
>> > >
>> > > https://builds.apache.org/job/Hadoop-Yarn-trunk builds
>> > > hadoop-yarn-project under trunk.
>> > >
>> > > See below:
>> > >
>> > > cd trunk/
>> > > $MAVEN_HOME/bin/mvn clean install -DskipTests
>> > > cd hadoop-yarn-project
>> > > $MAVEN_HOME/bin/mvn clean verify checkstyle:checkstyle
>> > >     findbugs:findbugs -Pdist -Pnative -Dtar
>> > >
>> > >
>> > > On Sat, Mar 15, 2014 at 12:03 PM, Omar@Gmail <omarnet...@googlemail.com> wrote:
>> > >
>> > > > Getting
>> > > >
>> > > > Results :
>> > > >
>> > > > Failed tests:
>> > > >   TestNetUtils.testNormalizeHostName:619 expected:<[81.200.64.50]>
>> > > > but was:<[UnknownHost123]>
>> > > >   TestZKFailoverController.testGracefulFailoverFailBecomingActive:484
>> > > > Did not fail to graceful failover when target failed to become active!
>> > > >   TestZKFailoverController.testGracefulFailoverFailBecomingStandby:518
>> > > > expected:<1> but was:<0>
>> > > >   TestZKFailoverController.testGracefulFailoverFailBecomingStandbyAndFailFence:540
>> > > > Failover should have failed when old node wont fence
>> > > >   TestMetricsSystemImpl.testMultiThreadedPublish:232 expected:<0>
>> > > > but was:<2>
>> > > >
>> > > > Tests in error:
>> > > >   TestZKFailoverController.testGracefulFailover:432 » test timed out
>> > > > after 2500...
>> > > >
>> > > > [ERROR] Failed to execute goal
>> > > > org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test)
>> > > > on project hadoop-common: There are test failures.
>> > > >
>> > > > Is this the correct build project?
>> > > > https://builds.apache.org/job/Hadoop-trunk
>> > > >
>> > > > Also, not sure why you suggested these, as they don't seem to fall
>> > > > under hadoop-trunk as I would expect them to:
>> > > > https://builds.apache.org/job/Hadoop-Yarn-trunk
>> > > > https://builds.apache.org/job/Hadoop-hdfs-trunk/
>> > > >
>> > > > Thanks
>> > > >
>> > > >
>> > > > On 15 March 2014 18:20, Omar@Gmail <omarnet...@googlemail.com> wrote:
>> > > >
>> > > > > I just took another svn update and am building again; will email
>> > > > > which module is failing for me.
>> > > > >
>> > > > > Thanks
>> > > > >
>> > > > >
>> > > > > On 15 March 2014 18:15, Ted Yu <yuzhih...@gmail.com> wrote:
>> > > > >
>> > > > >> There are several Jenkins jobs for Hadoop, e.g.
>> > > > >> https://builds.apache.org/job/Hadoop-Yarn-trunk
>> > > > >> https://builds.apache.org/job/Hadoop-hdfs-trunk/
>> > > > >>
>> > > > >> Which module are you looking at?
>> > > > >>
>> > > > >> Cheers
>> > > > >>
>> > > > >>
>> > > > >> On Sat, Mar 15, 2014 at 11:00 AM, Omar@Gmail <omarnet...@googlemail.com> wrote:
>> > > > >>
>> > > > >> > Hi,
>> > > > >> >
>> > > > >> > Is there a Jenkins build server for Hadoop?
>> > > > >> >
>> > > > >> > I'm getting test failures when building hadoop-trunk. Is that
>> > > > >> > known/normal?
>> > > > >> >
>> > > > >> > Omar
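[Editorial note on the hosts-file question above: an entry for the public IP address should not be needed for a single-machine build; per Steve's advice, what matters is that the machine can resolve its own hostname. A common single-machine arrangement is to add the hostname to the loopback line in /etc/hosts — the hostname `omars-mac.local` below is purely illustrative; use the output of `hostname` on your machine:]

```
127.0.0.1    localhost omars-mac.local
```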